[swift-evolution] Strings in Swift 4
David Waite
david at alkaline-solutions.com
Tue Feb 7 15:07:30 CST 2017
> On Feb 7, 2017, at 1:19 PM, Ted F.A. van Gaalen <tedvgiosdev at gmail.com> wrote:
>
>>
>> That proves nothing, though. The fact that people are using integers to
>> do this doesn't mean you need to use them, nor does it mean that you'll
>> get the right results from doing so. Typically examples that use
>> integer constants with strings are wrong for some large proportion of
>> unicode text.
>>
> This is all a bit confusing.
> From https://en.wiktionary.org/wiki/glyph, the definition of a glyph in our context:
> (typography, computing) A visual representation of a letter, character, or symbol, in a specific font and style.
>
> I now assume that:
> 1. A “plain” Unicode character (codepoint?) can result in one glyph.
> 2. A grapheme cluster always results in just a single glyph, true?
> 3. The only things that I can see on screen or in print are glyphs (“carvings”, visual elements that stand on their own).
> 4. In this context, a glyph is a humanly recognisable visual form of a character.
> 5. On this level (the glyph, what I can see as a user) it is neither relevant nor detectable
> how many Unicode scalars (codepoints?) or graphemes, or even what kind of
> encoding, the glyph was based upon.
>
> is this correct? (especially 1 and 2)
>
> Based on these assumptions, to me the definition of a character == glyph.
> Therefore, my working model: I see a row of characters as a row of glyphs,
> which are discrete autonomous visual elements, ergo:
> each element is individually addressable with an integer (ordinal).
While this is true, the encodings have focused on memory size, operational performance, and compatibility with existing tools over the ability to perform integer-indexed random access. Even the Character that Swift would return from such random access is effectively a substring, because a Character can be of arbitrary size.
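A rough sketch of why (written against the Swift 4 string model proposed in the manifesto, where String is once again a Collection of Character; the exact counts assume Unicode 9 grapheme rules):

    // One Character (an extended grapheme cluster) can span several scalars:
    let e: Character = "e\u{301}"    // "é" = LATIN SMALL LETTER E + COMBINING ACUTE ACCENT
    let flag: Character = "🇨🇦"       // two regional-indicator scalars

    let s = "e\u{301}🇨🇦"
    print(s.count)                   // 2 Characters
    print(s.unicodeScalars.count)    // 4 scalars
    print(s.utf8.count)              // 11 bytes

There is no fixed-width unit an integer subscript could address that corresponds to what the user sees as one glyph.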
In my experience, many languages and thus many developers use strings both as text and as data. This is even more common in scripting languages, where you don’t have NUL-termination to contend with when storing binary data within a ‘string’.
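Swift separates those two roles: arbitrary bytes live in Data (or [UInt8]), and turning bytes into a String is a validating, failable conversion. A minimal sketch using Foundation:

    import Foundation

    let bytes = Data(bytes: [0xE2, 0x9C, 0x93])       // valid UTF-8 for "✓"
    let text = String(data: bytes, encoding: .utf8)   // Optional("✓")

    let junk = Data(bytes: [0xFF, 0xFE, 0xFD])        // not valid UTF-8
    let none = String(data: junk, encoding: .utf8)    // nil: bytes are not always text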
I assume Swift has taken the standpoint that text cannot be handled properly or safely unless it is considered distinct from data, and thus the need for random access drops significantly when String is used properly.
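Correspondingly, positions within a String are opaque indices rather than integers; a sketch of what access looks like (again assuming the Swift 4 model):

    let s = "e\u{301}🇨🇦!"
    // There is no s[2]; you derive an index, which is an O(n) walk:
    let i = s.index(s.startIndex, offsetBy: 2)
    print(s[i])                      // "!"
    // The common case, iteration, needs no random access at all:
    for ch in s { print(ch) }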
Perhaps it would help if you provided a real-world example where String requires random access?
-DW