[swift-evolution] Default Generic Arguments

Srđan Rašić srdan.rasic at gmail.com
Thu Jan 26 07:55:02 CST 2017


Thanks for your questions, Xiaodi, I might have missed some scenarios.
I've done some rethinking and here are my conclusions. It's a slight
refinement of the idea from my previous mail. Bear with me :)

If we have

    func process<T>(_ t: T) {}

    process(5)

 the compiler will infer T as Int.

 Now we introduce a default generic argument

    func process<T = Int64>(_ t: T) {}

 and in order to keep source compatibility, if we do

    process(5)

 we must again infer T as Int. That means inference should take priority
 over default arguments. This is in accordance with the rule we defined
 earlier: if you can’t infer a particular type, fill in a default. Here we
 are able to infer a particular type, Int, so we do that.
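 For illustration, a sketch of the expected behaviour (the second call is
 just an extra example of ordinary inference at work):

    func process<T = Int64>(_ t: T) {}

    process(5)             // T inferred as Int; the Int64 default is not used
    process(5 as Int64)    // T inferred as Int64 from the argument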

 However, say we had

    struct Storage<T>: Integer {
        init(_ t: T)
    }

    let s = Storage(5)

 and we wanted to introduce a default generic argument:

    struct Storage<T = Int64>: Integer {
        init(_ t: T)
    }

 What happens with `s`? This is essentially the same problem as before,
 so the solution should also be the same: we must infer Int.
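 In other words, a sketch of the expected behaviour:

    let s = Storage(5)    // T inferred as Int; `s` is Storage<Int>, not Storage<Int64>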

 Would that be confusing for a developer? Maybe, but what is the
 alternative? Using the default would make no sense, because then

    let s = Storage("ops")

 would fail to compile. So inference in such cases is a must. A similar
 problem would be observed with inheritance and/or protocol adoption.

    protocol P {}
    struct R: P {}

    struct Storage<T = P>: Integer {
        init(_ t: T)
    }

    let s = Storage(R())

 Is T inferred as R or as P? To keep source compatibility, we must infer R.
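 That is:

    let s = Storage(R())    // T inferred as R; `s` is Storage<R>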

 In other words, I agree with you Xiaodi.

 Now to my argument about type declarations. Say we now do

    let s: Storage = Storage(R())

 In that case T must be P, because a type declaration must not be affected
 by inference: Storage on the left must be treated as Storage<P>. If that
 were not the case, consider what would happen if you upgraded a local
 variable to a property.

    class T {
        let s: Storage

        init() {
            s = Storage(R())
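            // the property's type must not be affected by inference from this assignment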
        }
    }

 What is inferred for Storage's T must not change when making such an
 upgrade, and allowing type inference in the initializer to affect the
 property type would make no sense.

 Thus, there has to be a rule that says:

    (I) In type declarations, no inference happens. By omitting generic
    arguments one accepts the defaults.

 And to repeat our second rule:

    (II) When instantiating a generic type or calling a generic function,
    by omitting generic arguments one lets the compiler specify them,
    following the principle: infer a particular type if possible, fill in
    a default otherwise.

 Let's go through some more examples.

 Declaring a function like

    func clear(_ storage: Storage)

 assumes `storage` to be of type Storage<P> because of rule (I).
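 that is, it is treated as if written:

    func clear(_ storage: Storage<P>)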

 Declaring a constant like

    let s = Storage(R())

 will infer `s` as Storage<R> because of rule (II), but

    let s: Storage = Storage(R())

 would be considered identical to

    let s: Storage<P> = Storage(R())

 because T defaults to P and rule (I) is applied to the left-hand side, so
 rule (II) applies to the right-hand side with T inferred from the left-hand
 side.

 We must also consider:

    let s = Storage("123")

 This is simple. With rule (II) we infer Storage<String>. However, if we do

    let s: Storage = Storage("123")

 the compiler must emit an error: cannot assign Storage<String> to Storage<P>.

 Next, consider the following type

    struct Storage<T = Int64>: Integer {
        init()
    }

 If we do

    let s = Storage()

 we should apply rule (II). In this case, no particular type can be
 inferred, so we fill in the default, meaning `s` would be Storage<Int64>.
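 In short:

    let s = Storage()    // nothing to infer T from; the default fills in, `s` is Storage<Int64>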

 What about generic function calls? Let's reuse the earlier example.

    protocol P {}
    struct R: P {}

    struct Storage<T = P>: Integer {
        init(_ t: T)
    }

    func clear<T>(_ storage: Storage<T>)

 If we do

    clear(Storage())

 we should use rule (II) to get the type. Here we don't have anything to
 infer from, so we fill in the default: T = P, making the argument
 Storage<P>.
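 Annotated:

    clear(Storage())    // nothing to infer T from; the type's default fills in, T == P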

 Doing

    clear(Storage(R()))

 would use rule (II) to infer T as R.

 However, consider a generic function with a default:

    func clear<T = R>(_ storage: Storage<T>)

 Now we have a default R in the function and a default P in the type. What
 if we now do

    clear(Storage())

 Should T be specialized as P or as R? This is a conflict of defaults. I'd
 say it should be resolved in favour of the function. In a way, a function
 that operates on type X can be considered an extension of that type and has
 more specific knowledge of how its arguments are used, so the function's
 preference should win.

 So, we try to apply rule (II). There is nothing to infer, so we try to
 fill in a default. We have two defaults: the function says the default is
 R, the struct says the default is P. As we resolve this in favour of the
 function, we choose R.
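 In code, a sketch of the proposed resolution:

    func clear<T = R>(_ storage: Storage<T>)

    clear(Storage())    // nothing to infer T from; two defaults apply (R from
                        // the function, P from the type), the function's default
                        // wins, so T == R and the argument is Storage<R>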

 Final example,

    clear(Storage(R()))

 There are no conflicts here. By applying rule (II) we directly infer T as
 R and use it regardless of the defaults.


Sorry for the long email, but hopefully it's now more understandable.
Looking forward to your feedback.



On Thu, Jan 26, 2017 at 2:15 AM, Xiaodi Wu <xiaodi.wu at gmail.com> wrote:

> Srdan, I'm afraid I don't understand your discussion. Can you simplify it
> for me by explaining your proposed solution in terms of Alexis's examples
> below?
>
> ```
> // Example 1: user supplied default is IntegerLiteralConvertible
>
> func foo<T=Int64>(t: T) { ... }
>
> foo(22)
> //  ^
> //  |
> //  What type gets inferred here?
> ```
>
> I believe that it is essential that the answer here be `Int` and not
> `Int64`.
>
> My reasoning is: a user's code *must not* change because a library *adds*
> a default in a newer version. (As mentioned in several design docs, most
> recently the new ABI manifesto, defaults in Swift are safe to add without
> breaking source compatibility.)
>
> Here, if version 1 of a library has `func foo<T>(t: T) { ... }`, then
> `foo(22)` must infer `T` to be `Int`. That's just the rule in Swift, and it
> would be severely source-breaking to change that. Therefore, if version 2
> of that library has `func foo<T=Int64>(t: T) { ... }`, then `foo(22)` must
> still infer `T` to be `Int`.
>
> Does your proposed solution have the same effect?
>
> ```
> // Example 2: user supplied default isn't IntegerLiteralConvertible
>
> func bar<T=Character>(t: T) { ... }
>
> bar(22)
> //  ^
> //  |
> //  What type gets inferred here?
> ```
>
> By the same reasoning as above, this ought to be `Int`. What would the
> answer be in your proposed solution?
>
>
> On Wed, Jan 25, 2017 at 2:07 PM, Srđan Rašić <srdan.rasic at gmail.com>
> wrote:
>
>> That's a good example Alexis. I do agree that generic arguments are
>> inferred in a lot of cases, my point was that they should not be inferred
>> in "type declarations". Not sure what's the right terminology here, but I
>> mean following places:
>>
>> (I) Variable/Constant declaration
>>
>>   ```
>>   let x: X
>>   ```
>>
>> (II) Property declaration
>>
>>   ```
>>   struct T {
>>     let x: X
>>   }
>>   ```
>>
>> (III) Function declaration
>>
>>   ```
>>   func a(x: X) -> X
>>   ```
>>
>> (IV) Enumeration case declaration
>>
>>   ```
>>   enum E {
>>     case x(X)
>>   }
>>   ```
>>
>> (V) Where clauses
>>
>>   ```
>>   extension E where A == X {}
>>   ```
>>
>> In those cases `X` should always mean `X<Int>` if it was defined as
>> `struct X<T = Int>`. That's all my rule says. Sorry for not being clear in
>> the last email :)
>>
>> As for the other cases, mostly those where an instance is created,
>> inference should be applied.
>>
>> Let's go through your examples. Given
>>
>> struct BigInt: Integer {
>>   var storage: Array<Int> = []
>> }
>>
>> func process<T: BinaryInteger>(_ input: BigInt<T>) -> BigInt<T> { ... }
>>
>> what happens with `let val1 = process(BigInt())`? I think this is
>> actually the same problem as what happens in case of `let x = BigInt()`.
>>
>> In such case my rule does not apply as we don't have full type
>> declaration. In `let x = BigInt()` type is not defined at all, while in `func
>> process<T: BinaryInteger>(_ input: BigInt<T>) -> BigInt<T> { ... }` type
>> is explicitly weakened or "undefaulted" if you will.
>>
>> We should introduce new rule for such cases and allowing `Storage=Int`
>> default to participate in such expressions would make sense. As you said,
>> it also solves second example: let val2 = process(0).
>>
>> I guess this would be the problem we thought we were solving initially
>> and in that case I think the solution should be what Doug suggested: if
>> you can’t infer a particular type, fill in a default.
>>
>> Of course, if the default conflicts with the generic constraint, it would
>> not be filled in and it would throw an error.
>>
>> For the sake of completeness,
>>
>> func fastProcess(_ input: BigInt<Int64>) -> BigInt<Int64> { ... }
>> let val3 = fastProcess(BigInt())
>>
>> would certainly infer the type from context as my rule does not apply to
>> initializers. It would infer BigInt<Int64>.
>>
>> As for your last example, I guess we can't do anything about that and
>> that's ok.
>>
>>
>> On Wed, Jan 25, 2017 at 7:50 PM, Alexis <abeingessner at apple.com> wrote:
>>
>>> Yes, I agree with Xiaodi here. I don’t think this particular example is
>>> particularly compelling. Especially because it’s not following the full
>>> evolution of the APIs and usage, which is critical for understanding how
>>> defaults should work.
>>>
>>>
>>> Let's look at the evolution of an API and its consumers with the example
>>> of a BigInt:
>>>
>>>
>>> struct BigInt: Integer {
>>>   var storage: Array<Int> = []
>>> }
>>>
>>>
>>> which a consumer is using like:
>>>
>>>
>>> func process(_ input: BigInt) -> BigInt { ... }
>>> let val1 = process(BigInt())
>>> let val2 = process(0)
>>>
>>>
>>> Ok that's all fairly straightforward. Now we decide that BigInt should
>>> expose its storage type for power-users:
>>>
>>>
>>> struct BigInt<Storage: BinaryInteger = Int>: Integer {
>>>   var storage: Array<Storage> = []
>>> }
>>>
>>>
>>> Let's make sure our consumer still works:
>>>
>>>
>>> func process(_ input: BigInt) -> BigInt { ... }
>>> let val1 = process(BigInt())
>>> let val2 = process(0)
>>>
>>>
>>> Ok BigInt in process’s definition now means BigInt<Int>, so this still
>>> all works fine. Perfect!
>>>
>>>
>>> But then the developer of the process function catches wind of this new
>>> power user feature, and wants to support it.
>>> So they too become generic:
>>>
>>>
>>> func process<T: BinaryInteger>(_ input: BigInt<T>) -> BigInt<T> { ... }
>>>
>>>
>>> The usage sites are now more complicated, and whether they should
>>> compile is unclear:
>>>
>>>
>>> let val1 = process(BigInt())
>>> let val2 = process(0)
>>>
>>>
>>> For val1 you can take a hard stance with your rule: BigInt() means
>>> BigInt<Int>(), and that will work. But for val2 this rule doesn't work,
>>> because no one has written BigInt unqualified. However if you say that the
>>> `Storage=Int` default is allowed to participate in this expression, then we
>>> can still find the old behaviour by defaulting to it when we discover
>>> Storage is ambiguous.
>>>
>>> We can also consider another power-user function:
>>>
>>>
>>> func fastProcess(_ input: BigInt<Int64>) -> BigInt<Int64> { ... }
>>> let val3 = fastProcess(BigInt())
>>>
>>>
>>> Again, we must decide the interpretation of this. If we take the
>>> interpretation that BigInt() has an inferred type, then the type checker
>>> should discover that BigInt<Int64> is the correct result. If however we
>>> take stance that BigInt() means BigInt<Int>(), then we'll get a type
>>> checking error which our users will consider ridiculous: *of course* they
>>> wanted a BigInt<Int64> here!
>>>
>>> We do however have the problem that this won’t work:
>>>
>>>
>>> let temp = BigInt()
>>> fastProcess(temp) // ERROR — expected BigInt<Int64>, found BigInt<Int>
>>>
>>>
>>> But that’s just as true for normal ints:
>>>
>>>
>>> let temp = 0
>>> takesAnInt64(temp) // ERROR — expected Int64, found Int
>>>
>>>
>>> Such is the limit of Swift’s inference scheme.
>>>
>>>
>>
>