[swift-dev] State of String: ABI & Performance

Ben Cohen ben_cohen at apple.com
Thu Jan 11 23:23:31 CST 2018



> On Jan 11, 2018, at 17:34, Michael Ilseman <milseman at apple.com> wrote:
> 
> 
> 
>> On Jan 11, 2018, at 4:20 PM, Ben Cohen <ben_cohen at apple.com> wrote:
>> 
>> 
>> 
>>> On Jan 11, 2018, at 12:32 PM, Michael Ilseman via swift-dev <swift-dev at swift.org> wrote:
>>> 
>>> For a more general solution, I think a `var numericValue: Int? { get }` on Character would make sense. Unicode defines (at least one) semantics for this and ICU provides this functionality already.
>>> 
>> 
>> Minor style point – this should be a failable init on Int rather than a computed property on Character
>> 
>> i.e. Int.init?(_: Character), matching Int.init?(_: String, radix: Int = 10), only it doesn’t need the radix arg because it’s only a single character.
>> 
> 
> `Int.init?` is probably better, yes. If you wanted to take that to its logical conclusion, that would include `Double.init?` for mathematical constants and some `Rational.init` for fraction graphemes. These are pretty far off the deep end ;-)
> 
> Minor logic point – radix isn't important, but that's not because it’s only a single character but rather because of how Unicode(/ICU) defines character properties. Numbers can be much higher than 9, e.g. 万 is ten-thousand. Even worse is cuneiform, which is sexagesimal (and of course cuneiform is properly encoded in Unicode!). Luckily, Unicode’s numeric values are always presented in base-10, AFAICT.
> 

Seems like two different features got tangled here, both of which are useful…

On the one hand, there’s Chris’s case of “I’ve got a Character and I want to convert an ASCII 0-9/a-f hex digit into an Int, and it’s incredibly painful in Swift 4”. This is a pretty common everyday need, and it calls for a failable initializer on Int, similar to the one that takes a String, that is very fast and covers only those characters. Thinking about it, you probably do want an optional radix argument, because you want to explicitly control whether “a” is nil or 10.
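
To make that concrete, here’s a rough sketch of the shape such an initializer might take (the spelling and exact semantics are illustrative, not a committed design):

extension Int {
  // Hypothetical failable initializer (a sketch, not shipped API):
  // converts a single ASCII digit Character to its value in `radix`.
  // With the default radix of 10, "a" is nil; with radix: 16 it is 10.
  init?(_ character: Character, radix: Int = 10) {
    precondition(2...36 ~= radix, "radix must be in 2...36")
    // A Character can be a multi-scalar grapheme; only single
    // ASCII scalars count as digits here.
    guard character.unicodeScalars.count == 1,
          let ascii = character.unicodeScalars.first?.value
    else { return nil }
    let digit: UInt32
    switch ascii {
    case 0x30...0x39: digit = ascii - 0x30       // "0"..."9"
    case 0x41...0x5A: digit = ascii - 0x41 + 10  // "A"..."Z"
    case 0x61...0x7A: digit = ascii - 0x61 + 10  // "a"..."z"
    default: return nil
    }
    guard digit < UInt32(radix) else { return nil }
    self = Int(digit)
  }
}

so that:

let hexDigit: Character = "f"
Int(hexDigit, radix: 16)  // Optional(15)
Int(hexDigit)             // nil – not a decimal digit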

On the other hand, there’s the separate feature of exposing Unicode numeric properties of characters, covering things like 5️⃣ = 5.0 and ½ = 0.5, but where “a” is nil. This possibly works better as a property like numericValue: Double? than as an init on Double.
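
For illustration, a sketch of that property, leaning on the ICU-backed Unicode character data (the name numericValue and the single-scalar restriction are just assumptions of the sketch):

extension Character {
  // Hypothetical numericValue property, sketched on top of the
  // Unicode scalar properties API; ICU exposes the same data via
  // u_getNumericValue. "5" -> 5.0, "½" -> 0.5, "万" -> 10000.0,
  // "a" -> nil.
  var numericValue: Double? {
    // Keep it to single-scalar Characters for simplicity; multi-scalar
    // graphemes such as the keycap sequence 5️⃣ would need extra handling.
    guard unicodeScalars.count == 1, let scalar = unicodeScalars.first
    else { return nil }
    return scalar.properties.numericValue
  }
}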

> 
>> If implemented inside the std lib, it can still access character’s internals, which is a reasonable thing to do for something so performance-sensitive.
>> 
> 
