[swift-users] StringLiteralConvertible protocol question
Loïc Lecrenier
loiclecrenier at icloud.com
Mon Jan 11 08:08:57 CST 2016
> Yes. Because StringLiteralConvertible inherits from ExtendedGraphemeClusterLiteralConvertible, which in turn inherits from UnicodeScalarLiteralConvertible.
Yes, I know. But I wonder why that is, considering that something that is StringLiteralConvertible will seemingly never use the required initializer from UnicodeScalarLiteralConvertible.
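For reference, the standard library (as of Swift 2) declares the chain roughly like this, simplified and with the associated-type details omitted:

protocol UnicodeScalarLiteralConvertible {
    typealias UnicodeScalarLiteralType
    init(unicodeScalarLiteral value: UnicodeScalarLiteralType)
}

protocol ExtendedGraphemeClusterLiteralConvertible : UnicodeScalarLiteralConvertible {
    typealias ExtendedGraphemeClusterLiteralType
    init(extendedGraphemeClusterLiteral value: ExtendedGraphemeClusterLiteralType)
}

protocol StringLiteralConvertible : ExtendedGraphemeClusterLiteralConvertible {
    typealias StringLiteralType
    init(stringLiteral value: StringLiteralType)
}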
> Is there a way to write something that is a unicode scalar literal, but not a string literal?
> Yes. You have already done it by extending only UnicodeScalarLiteralConvertible. A string literal is what people read; a unicode scalar is how the string is encoded so the computer can store and read it.
>
> for example:
>
> let uScalar = "a".unicodeScalars.first! // UnicodeScalar U+0061; uScalar.value == 97
> print(uScalar.dynamicType) // UnicodeScalar, NOT an Int
Sorry, I am not sure what this answers :(
I wanted to know if there was a way to write something like (for example) uscalar"\u{65}" that would only be a unicode scalar literal, and not a string literal. Or if there was any other way to tell the compiler “please use the initializer from UnicodeScalarLiteralConvertible instead of the one from StringLiteralConvertible”.
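The closest thing I have found is calling the protocol initializers explicitly, which is not literal syntax but does select a specific initializer. With the Int extensions from my example below in scope:

let u = Int(unicodeScalarLiteral: "\u{65}")                   // 1
let g = Int(extendedGraphemeClusterLiteral: "\u{65}\u{0301}") // 2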
I guess the more fundamental question I wanted to ask was “why must StringLiteralConvertible inherit from UnicodeScalarLiteralConvertible?”
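The opposite direction does seem to work, as Owen says: a type that adopts only UnicodeScalarLiteralConvertible accepts single-scalar literals and nothing else. A minimal sketch, using a hypothetical ScalarOnly type:

struct ScalarOnly : UnicodeScalarLiteralConvertible {
    let scalar: UnicodeScalar
    init(unicodeScalarLiteral value: UnicodeScalar) {
        self.scalar = value
    }
}

let ok: ScalarOnly = "\u{65}"    // accepted: a single unicode scalar
// let no: ScalarOnly = "hello"  // error: "hello" is not a unicode scalar literal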
Thanks,
Loïc
>
> On Mon, Jan 11, 2016 at 4:54 AM, Loïc Lecrenier <swift-users at swift.org> wrote:
> Hi :)
>
> I have been trying to understand the StringLiteralConvertible protocol, but there is something that I still can’t explain.
>
> //-----------------------------
>
> extension Int : UnicodeScalarLiteralConvertible {
>     public init(unicodeScalarLiteral value: UnicodeScalar) {
>         self = 1
>     }
> }
>
> extension Int : ExtendedGraphemeClusterLiteralConvertible {
>     public init(extendedGraphemeClusterLiteral value: Character) {
>         self = 2
>     }
> }
>
> extension Int : StringLiteralConvertible {
>     public init(stringLiteral value: String) {
>         self = 3
>     }
> }
>
> let a : Int = "\u{65}" // e
> let b : Int = "\u{65}\u{0301}" // é
> let c : Int = "hello"
>
> //-----------------------------
>
> If I only write the first extension: I can only initialize a, and its value will be 1.
> If I write the first two extensions: I can initialize a and b, and their values will be 2.
> And if I keep the three extensions: a, b, and c will all have a value of 3.
>
> So it seems like the compiler prefers calling the initializer from (in order of preference):
> 1. StringLiteralConvertible
> 2. ExtendedGraphemeClusterLiteralConvertible
> 3. UnicodeScalarLiteralConvertible
>
> But for something to be StringLiteralConvertible, it needs to be ExtendedGraphemeClusterLiteralConvertible and UnicodeScalarLiteralConvertible, which means I have to define two initializers that will never be called.
>
> Is that correct?
> Is there a way to write something that is a unicode scalar literal, but not a string literal?
>
> Thank you,
>
> Loïc
>
> --
>
> Owen Zhao