[swift-evolution] [Meta] Let's talk TouchBar + Unicode

Jonathan Hull jhull at gbis.com
Mon Oct 31 17:45:54 CDT 2016


Premature standardization is just as bad as premature optimization...

> Because your arguments are assuming that the user can’t figure things out, when they totally can. It is an extremely common (almost subconscious) caricature of “The User”.  As a young designer, I was yelled at once by Don Norman for doing the same thing.  I was complaining about the design of a room key (The arrow on the key needed more salience to afford how to use it).  He replied, “Sure, it could be better. Things can always be better… but were you able to get into your room?”.  Yes, I had.  He then asked several other people nearby if they were able to get into their rooms.  Everyone had.  “How many people in the entire hotel do you think were not able to get into their room because of this?  Tell me again why the hotel should spend thousands of dollars to reprint their keys to solve this thing which may be an annoyance, but hasn’t caused an actual problem?”.  He then told me that people were much more capable than I was imagining them to be, and that I should give them the credit they deserve because failing to do so is ultimately harmful not just to the user of my designs, but to the design profession in general.  It took me several years to really truly understand what he meant… but he was (as usual) right.
> 
> Perhaps he was right about your room key. But we're not designing room keys here. (Also, have you *never* been annoyed by room keys and wondered to yourself who the !@#$ designed that piece of crap? But I digress.)

Sorry if I was unclear.  The point is not that you shouldn’t deal with annoyances (you should improve the keys at the next already-scheduled re-print).  The point is that I was imagining the problem to be much worse (and much more urgent) than it actually was, because I wasn’t realistic about the user’s capabilities.  It may be counter-intuitive, but underestimating users leads to worse designs overall and a lack of trust in the designer (and designers in general) by the client.  It also blinded me to much bigger improvements that could have truly improved the guest’s stay.


> Figure out how to use your room key and the task is done. You're in your room. Figure out how to type and you're at step 0 of a long list of steps towards writing anything remotely useful in Swift. Make every step slightly annoying and you've got an infuriating language.

Why do you imagine that it has to be annoying or difficult?  You already told me that you find image literals “beginner-friendly”, and this could have a similar UX. It is not taking away any capability you have now, just adding the ability to use symbols much more easily when you want to.


>  I was being slightly hyperbolic, but you are arguing that figuring out how to use autocomplete (or even use the option key to type a symbol) is too difficult for beginners.
> 
> I'm not arguing that it's "too difficult" a challenge to surmount--I'm arguing that if we stick to ASCII the challenge would not exist in the first place, and that we should not challenge the user to any degree whatsoever with regards to typing. There are other, more salient challenges already.
>  
> You have also argued that having both ‘formsUnion’ and the union symbol would be too much of a burden on someone learning the language.  You are assuming they will fail, no matter how well the UI is designed.  I am saying, there might be a brief moment where they are learning, but once they get it… they will get it… and they will be able to express themselves much more as a result.
> 
> Also, I don’t think I have ever talked about requiring anyone to figure out the symbols.  Rather I have talked about building on-ramps to teach them to be able to use the symbols freely where they want to.
> 
> I understand. There's a mini-argument here that I didn't write out. It's been said that Swift is an "opinionated" language, not a to-each-their-own/design-by-committee kind of language. Thus, afaict, the modus operandi is that what's decided to be the best way is adopted as the *only* way.
> 
> Other than for compatibility reasons, perhaps, there are few if any aliases in the standard library. If we decide that the less-than-or-equal-to symbol is the best way to invoke that particular operation, then `<=` would be deprecated and then in the next version removed. If that's too much to stomach, then it's a hint that perhaps the less-than-or-equal-to symbol isn't good enough to replace `<=`. I don't see room for an in-between solution where the same function is named two or three ways.

If this is true, I would argue for ≤ (with <= bringing up ≤ in autocomplete). I would also probably argue for a longer deprecation schedule.  We can argue about that in phase 2, though.
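For what it’s worth, nothing stops a library from experimenting with this today: Swift’s operator character set already admits the mathematical symbols. A minimal sketch, assuming we simply forward to the existing <= (the ≤ operator below is hypothetical and not part of the standard library):

```swift
// Hypothetical ≤ operator, defined as a thin alias for the
// standard library's existing <= on Comparable types.
infix operator ≤ : ComparisonPrecedence

func ≤ <T: Comparable>(lhs: T, rhs: T) -> Bool {
    return lhs <= rhs
}

// Reads exactly like the math it represents:
let withinLimit = 3 ≤ 4   // true
print(withinLimit)
```

Because it is pure forwarding, it could coexist with <= through any deprecation schedule we chose.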

I am actually a little annoyed at the removal of appending(contentsOf:), because I used both that and ‘+’, finding one or the other clearer in different contexts.  I’ll get used to it eventually though.

> I actually just finished teaching a Swift class for people who had never programmed before.  They had trouble with <= (“Why does the arrow mean less than or equal? Why can’t I use ≤?") and != (“But the ! has to do with optionals”… I think they thought it was unwrap and assign like *= is multiply and assign).  They were able to solve these issues, of course, as they are intelligent people… but let’s not pretend that ASCII is magically free of confusion.
> 
> If only you'd read as many documents as I have where the less-than-or-equal-to sign is written by underlining <, you would see that your proposed solution to these issues is not nearly as obvious as you make it out to be. (I'm a biologist by training; the number of manuscripts in that field where one sees it written as "<" [note to non-rich-text readers: that's "<" with an underline] is very near 100%.) 

Just because Microsoft Word is horrible and unintuitive doesn’t mean that we have to be (also note that those authors were trying to produce the ≤ symbol, as opposed to writing <= or something similar).  The point is that we can do better than the status quo… and we should.


> Having fewer available symbols means we are forced to chain the symbols we do have together in suboptimal ways (I believe there is talk of adding <=> to the standard library).
> 
> 
>> Let’s take, as an example, discovery of “formUnion”.  How will a user, who doesn’t know about this, discover it’s existence and how it works?
>> 
>> • For the lucky people who already know about unions, but not swift’s crazy “formUnion”, they are in luck.  If they just start typing ‘uni…’, then ‘formUnion’ will show in the autocomplete.  Hmm… sounds like the exact method I was talking about with symbols.
>> 
>> • Maybe they will command-click into the definition file, and then see formUnion definition there.  But couldn’t they also see the union symbol defined there?
>> 
>> • Maybe they search for the documentation and find “formUnion” explained on the page. Again, the same is true of the operator.
>> 
>> OK, and if you've never seen that symbol before, just how do you say "∪"? As in, literally, how would someone who just discovered such a hypothetical operator in Swift's documentation for SetAlgebra turn to another programmer and pronounce it? Here's the description in the documentation: "Adds the elements of the given set to the set." Not very helpful for figuring that out, is it?
> 
> In Xcode, they could mouse over it and see the word “union”.  In the documentation, it could say “Union Operator. Adds the elements of the given set to the set.”  It could even have a hint on how to type it (e.g. '^union’), and that part could be auto-generated.
> 
> But if it needs to be documented as "union operator" and needs a tooltip to say "union" and you need to type "^union", why wouldn't you just name it "union" _as it already is_?

Because it is much more concise.  It is essentially “named” union… we just have a shorthand way to display it that doesn’t have to distract as much from everything else.  Would you rather have: '2.formSum(2)' or ‘2 + 2’ ?  It makes a REAL difference when you are combining things into a fairly complex equation.  One line vs. half a page.  

The recent update from named methods to symbols in Decimal made my code so much more readable and clear.  The transformation is remarkable.  I can understand at a glance what I had to read and comprehend before.  Why should we deny ourselves similar clarity for set, vector, or matrix operations?
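To make the comparison concrete, here is a sketch of what a union operator could look like as a thin alias over the existing union(_:) requirement of SetAlgebra (the ∪ operator itself is hypothetical, not something the standard library provides):

```swift
// Hypothetical ∪ operator forwarding to SetAlgebra.union(_:).
// Set already conforms to SetAlgebra, so this works for Set,
// OptionSet types, CharacterSet, etc.
infix operator ∪ : AdditionPrecedence

func ∪ <S: SetAlgebra>(lhs: S, rhs: S) -> S {
    return lhs.union(rhs)
}

let evens: Set = [2, 4]
let odds: Set = [1, 3]
let all = evens ∪ odds   // {1, 2, 3, 4}
print(all.sorted())
```

In a chained expression such as `(a ∪ b) ∪ (c ∪ d)` versus `a.union(b).union(c.union(d))`, the one-line-versus-half-a-page difference becomes visible quickly.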


>> We do need to be aware of and support beginning users, but optimizing Swift (or any expert system) for beginners just leads to a disempowering experience for everyone.
>> 
>> Disagree strongly. Easy things should be easy, even when they're part of complex systems. Not everyone learning Swift has to grasp copy-on-write on day 1, but no one should have to struggle with how to type something.
> 
> I do agree that easy things should be easy. I am saying we can make this easy if we want to.  Easy enough that it can be used in any Swift code, even the core libraries when appropriate. 
> 
> Again, if a symbol isn’t clear, we shouldn’t use it (and we can argue over what is clear or not in other threads). We should use whatever is clearest and most expressive. If that is ASCII, that is great, and if it is unicode, also great.
> 
> I'm arguing that any ASCII character is at baseline, simply by virtue of having the increased recognition that ASCII characters do, clearer than any non-ASCII character to a general audience. There would have to be a huge win in expressiveness for any particular non-ASCII character to overcome that handicap. And I'm arguing that the most plausible scenario in the standard library where we might see a huge win--the union operator--does not, for me, pass that bar.

Well, if on a case-by-case basis we end up choosing ASCII characters and words, that is fine. I am just saying that we shouldn’t have a rule that prevents us from choosing the best option in cases where we decide that is the Unicode version.


> In the case of the menu item, it is used to teach an accelerator which saves time.  I am saying, we can use the same technique to teach symbols which are more concise, are more easily located in code, and are more representative.  If the symbol is less representative than the word in a particular case, then we should use the word.  We should use whatever is best.
> 
> Yes, and I'm saying, in the standard library, what's best isn't the symbol.

In most cases, the word is probably clearest.  But we decided ‘+’ (which is a symbol) is more clear than ‘formSum()’.  As I said above, I found a huge real world improvement from moving to symbols with Decimal.  I am sure that we will find other cases as well where symbols are helpful, especially as we start bringing in both heavy math (like vectors and matrices) and advanced string processing.
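As an illustration of the kind of win I mean for vector math, here is a sketch of a dot-product operator on a toy vector type (both Vector2 and the • operator are hypothetical, invented for this example; • falls inside Swift’s allowed operator characters):

```swift
// Hypothetical dot-product operator on a toy 2-D vector type.
infix operator • : MultiplicationPrecedence

struct Vector2 {
    var x: Double
    var y: Double

    // Dot product: sum of component-wise products.
    static func • (lhs: Vector2, rhs: Vector2) -> Double {
        return lhs.x * rhs.x + lhs.y * rhs.y
    }
}

let a = Vector2(x: 1, y: 2)
let b = Vector2(x: 3, y: 4)
let dot = a • b   // 1*3 + 2*4 = 11.0
print(dot)
```

Compare `a • b` against something like `a.formingDotProduct(with: b)` inside a longer expression; the symbol carries the established mathematical meaning directly.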


> Again, this has nothing to do with how you write your own third-party libraries.
> 
> Why should we force ourselves to use something, which is by definition, less clear/expressive?
> 
> See above. I can't accept your premise that the clearest and most expressive choice for any standard library API would be a non-ASCII symbol.

Right, and that is something we can argue given the context of each issue.  I can’t accept the premise that you know FOR SURE that there aren’t cases where something else might be the best choice.


If we stop ourselves from even CONSIDERING other options, then I can tell you we won’t find the best one.  In my experience, productive creativity requires two phases:

1) Ideation - Where you generate lots of ideas (most of which are unworkable).  I tell my students that the road to GREAT ideas runs through bad ideas, so don’t stop with ideas that are merely ok.

2) Pruning - Where you take all of those ideas and nurture and sculpt the good ones… picking just the one or two that work best.

I have found this list to be downright hostile at times towards anyone attempting step 1, and I think it is hurting us.  It may work for iterative refinement, but it makes big leaps much harder, so we get stuck in local maxima (and bikeshedding).


When I was little, I was lucky enough to meet Buzz Aldrin, and he told me: “Reach for the stars, and you might just touch the sky”.  I was confused, and asked him what it meant.  He said, “You need to aim high, because you will surprise yourself with what you are capable of (even if you don’t reach your goal) and you will do better than you thought you ever could.  If you aim for your actual goal, you will never actually reach it.”  That stuck with me my whole life.

All I am saying is that we need to stop imagining that everything is too difficult (for ourselves and for “The User”). Let’s figure out what we would want in an ideal world, and then we can figure out how to make some approximation work in our own...


