So far, we've discussed two ways of interpreting `self = nil`, both of which have a sensible solution, in my opinion:

1. It's a special rule, like you said, which can be seen as counter-intuitive; but recall that `return nil` is just as much of a special rule and is also largely counter-intuitive. The benefit of `self = nil` is that it's much more in line with initialization semantics, it provides more uniform syntax, and it's a bit less restrictive.

2. It's an `inout Self!`, like Greg said, which can be seen as more cumbersome. Implicitly unwrapped optionals are a bit difficult, but this "variation" of them is much more restrictive than the normal ones: unlike a normal implicitly unwrapped optional, this one cannot be accessed after being assigned `nil` (and it also cannot be indirectly assigned `nil`, because escaping `self` is not allowed before full initialization), so the only place it can be set to `nil` is directly in the initializer. This means that `self` can safely be treated as `inout Self` before being set to `nil`; after being set to `nil` it no longer matters, because you aren't allowed to access it anyway (it isn't fully initialized).

Overall, I'd go with #2, because it involves much less confusing magic, and the restrictions on `self` as `inout Self!` are imposed by existing, well-understood initialization logic, so the provided guarantees don't really come at the cost of much clarity.
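To make the comparison concrete, here's a rough sketch of what interpretation #2 would look like next to what we have to write today. (The `self = nil` spelling is, of course, the proposed syntax and not valid Swift; the `Celsius`/`Kelvin` types are made up purely for illustration.)

    // Today: failure has its own special spelling, `return nil`.
    struct Celsius {
        var degrees: Double

        init?(degrees: Double) {
            guard degrees >= -273.15 else { return nil }
            self.degrees = degrees
        }
    }

    // Interpretation #2: `self` acts like a write-once `inout Self!`, so failure
    // is just one more assignment to `self` (hypothetical syntax).
    struct Kelvin {
        var degrees: Double

        init?(degrees: Double) {
            guard degrees >= 0 else {
                self = nil   // hypothetical: marks the initializer as failed
                return       // `self` may not be accessed after this point
            }
            self.degrees = degrees
        }
    }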
On Jun 9, 2017, at 2:23 PM, Xiaodi Wu <xiaodi.wu@gmail.com> wrote:

> On Fri, Jun 9, 2017 at 07:12 Gor Gyolchanyan <gor@gyolchanyan.com> wrote:
>
>> I think a good approach would be to have `self = nil` only mean "the initializer is going to fail", because if your type is ExpressibleByNilLiteral, its `nil` already carries the same meaning as if the type were not ExpressibleByNilLiteral and were an optional instead, so having a failable initializer doesn't really make sense in that case (since you could have initialized `self` to its own `nil` in case of failure). Still, some valid use cases may exist, so the natural (and quite intuitive) way to circumvent this would be to call `self.init(nilLiteral: ())` directly.
>
> So you would create a special rule that `self = nil` means a different thing in an initializer than it does in a function? Essentially, then, you're creating your own variation on an implicitly unwrapped optional, where `self` is of type `inout Self?` for assignment in initializers only but not for any other purpose. Implicitly unwrapped optionals are hard to reason about, and a variation on them would be even harder to understand. I don't think this is a workable design.
>
> It might be possible to have `self` be of type `inout Self?`; however, I do think Greg is right that it would create more boilerplate than the current situation.
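(To make the boilerplate concern concrete, here's roughly what the `inout Self?` reading would force; a sketch with hypothetical syntax, and the `Suit` enum is made up. Once `self` is optional, every use of it after the main switch has to be nil-checked or forced, even on the success path.)

    // Hypothetical: `self` has type `inout Self?` inside init?.
    enum Suit {
        case hearts, spades, diamonds, clubs

        init?(symbol: Character) {
            switch symbol {
            case "♥": self = .hearts
            case "♠": self = .spades
            case "♦": self = .diamonds
            case "♣": self = .clubs
            default:  self = nil
            }
            // Any post-processing now has to unwrap `self`:
            if self! == .spades || self! == .clubs {
                // ...
            }
        }
    }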
>> On Jun 9, 2017, at 2:07 PM, Xiaodi Wu <xiaodi.wu@gmail.com> wrote:
>>
>>> On Fri, Jun 9, 2017 at 06:56 Gor Gyolchanyan <gor@gyolchanyan.com> wrote:
>>>
>>>> The type of `self` could remain `inout Self` inside the failable initializer. The ability to assign `nil` would be compiler magic (much like `return nil` is compiler magic) that is meant to introduce uniformity to the initialization logic.
>>>>
>>>> The idea is to define all the different ways initialization can take place and expand them to be used uniformly on both `self` and all of its members, as well as remove the ways that do not make sense for their purpose.
>>>>
>>>> Currently, there are 3 ways of initializing `self` as a whole:
>>>>     1. delegating initializer
>>>>     2. assigning to self
>>>>     3. returning nil
>>>>
>>>> #1: The delegating initializer is pretty much perfect at this point, in my opinion, so no changes there.
>>>>
>>>> #2: The only exception in assigning to self is the `nil` inside failable initializers.
>>>>
>>>> #3: The only thing that can be returned from an initializer is `nil`, which is compiler magic, so we can think of it as a misnomer (because we aren't really **returning** anything).
>>>>
>>>> If, for a second, we forget about potential factory initializers, returning anything from an initializer doesn't make much sense, because an initializer is conceptually meant to bring an existing object in memory to a type-specific valid state. This semantic was very explicit in Objective-C with `[[MyType alloc] init]`. Especially since the initializer does not even syntactically specify a return type, the idea of returning from an initializer is counter-intuitive both syntactically and semantically.
>>>>
>>>> The actual *behavior* of `return nil` is very sensible, so the behavior I imagine for `self = nil` would largely be the same (except that it wouldn't need to return immediately, and non-self-accessing code would be allowed to execute before returning). Being able to assign `nil` to a non-optional (ExpressibleByNilLiteral doesn't count) may feel a bit wonky,
>>>
>>> What happens when Self is ExpressibleByNilLiteral and you want to initialize self to nil? That is what `self = nil` means if `self` is of type `inout Self`. If `self` is of type `inout Self` and Self is not ExpressibleByNilLiteral, then it must be an error to assign nil to self. Anything else does not make sense, unless `self` is of type `inout Self?`.
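(To illustrate the ambiguity Xiaodi is pointing at: for a type that conforms to ExpressibleByNilLiteral, `self = nil` inside an initializer is already legal today and means "initialize me to my own nil", not "fail". The `Distance` type below is made up for illustration.)

    struct Distance: ExpressibleByNilLiteral {
        var meters: Double?

        // The type's own `nil`: an unknown distance.
        init(nilLiteral: ()) {
            self.meters = nil
        }

        init(meters: Double) {
            self.meters = meters
        }

        // Valid Swift today: `self = nil` here goes through init(nilLiteral:);
        // it does not mean "this initializer failed".
        init(parsing string: String) {
            if let value = Double(string) {
                self = Distance(meters: value)
            } else {
                self = nil
            }
        }
    }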
>>>> but not as wonky as returning nil from something that is meant to initialize an object in place and doesn't look like it should return anything.
>>>>
>>>> # Factory Initializers
>>>>
>>>> In the case of factory initializers, the much-discussed `factory init` syntax could completely flip this logic by making the initializer essentially a static function that returns an object. In this case the initializer could be made to specify a return type (the supertype of all possible factory-created objects), and assigning to self would be forbidden because there is no self yet:
>>>>
>>>>     extension MyProtocol {
>>>>
>>>>         public factory init(weCool: Bool) -> MyProtocol {
>>>>             self = MyImpl() // error: cannot assign to `self` in a factory initializer
>>>>             self.init(...)  // error: cannot make a delegating initializer call in a factory initializer
>>>>             if weCool {
>>>>                 return MyCoolImpl()
>>>>             } else {
>>>>                 return MyUncoolImpl()
>>>>             }
>>>>         }
>>>>
>>>>     }
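(If such a `factory init` existed, the call site would presumably remain an ordinary initializer call while the concrete type is chosen inside the factory; a hypothetical call site, reusing the names from the example above:)

    // Hypothetical: resolves to a MyCoolImpl (or MyUncoolImpl) behind the scenes.
    let instance: MyProtocol = MyProtocol(weCool: true)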
>>>> # In-place Member Initializers
>>>>
>>>> In addition, member initialization is currently only possible with #2 (as in `self.member = value`), which could be extended in a non-factory initializer to allow initializing a member in place, like this:
>>>>
>>>>     self.member.init(...)
>>>>
>>>> This would complement the delegating initialization syntax, while giving a more reliable performance guarantee that the member will not be copy-initialized.
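(For the in-place member initialization idea quoted above, the difference would look roughly like this; the `self.buffer.init(...)` spelling is the proposed, hypothetical one, and `LargeBuffer` is a made-up type:)

    struct Wrapper {
        var buffer: LargeBuffer

        init(size: Int) {
            // Today: the member is initialized by assignment, which may create a
            // temporary LargeBuffer and then move or copy it into place.
            self.buffer = LargeBuffer(size: size)
        }

        init(inPlaceSize size: Int) {
            // Proposed (hypothetical syntax): initialize the member directly in
            // its own storage, guaranteeing no intermediate copy.
            self.buffer.init(size: size)
        }
    }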
>>>> On Jun 9, 2017, at 1:32 PM, Xiaodi Wu <xiaodi.wu@gmail.com> wrote:
>>>>
>>>>> If `self` is not of type `inout Self?`, then what is the type of `self` such that you may assign it a value of `nil`?
>>>>>
>>>>> It certainly cannot be of type `inout Self`, unless `Self` conforms to `ExpressibleByNilLiteral`, in which case you are able to assign `self = nil` an unlimited number of times, but that has a totally different meaning.
>>>>>
>>>>> Could `self` be of type `inout Self!`? Now that implicitly unwrapped optionals are no longer their own type, I'm not sure that's possible. But even if it were, that seems unintuitive and potentially error-prone.
>>>>>
>>>>> So I think Greg is quite right that, to enable this feature, `self` would have to be of type `inout Self?`, which is intriguing but potentially more boilerplatey than the status quo.
>>>>>
>>>>> On Fri, Jun 9, 2017 at 05:24 Gor Gyolchanyan via swift-evolution <swift-evolution@swift.org> wrote:
>>>>>
>>>>>> Good point, but not necessarily.
>>>>>>
>>>>>> Since you cannot access `self` before it is fully initialized, and since `self` can only be initialized once, this would mean that after `self = nil` you won't be allowed to access `self` in your initializer at all. You'll still be able to do any potential cleanup, though.
>>>>>>
>>>>>> Also, since there can be only one `self = nil`, there's no reason to treat `self` as `inout Self?`, because the only place it can be `nil` is the place where it cannot be accessed any more.
>>>>>>
>>>>>> On Jun 9, 2017, at 7:45 AM, Greg Parker <gparker@apple.com> wrote:
>>>>>>
>>>>>>> On Jun 8, 2017, at 5:09 AM, Gor Gyolchanyan via swift-evolution <swift-evolution@swift.org> wrote:
>>>>>>>
>>>>>>>> 1. Arbitrary `self` Assignments In Initializers
>>>>>>>>
>>>>>>>> The first idea is to allow `self = nil` inside failable initializers (essentially making `self` look like `inout Self?` instead of `inout Self` with magical `return nil`), so that all initializers can uniformly be written in `self = ...` form for clarity and convenience. This should, theoretically, be nothing but a `defer { return nil }` type of rewrite, so I don't see any major difficulties implementing this. This is especially useful for failable-initializing enums, where the main switch simply assigns to self in all cases and the rest of the initializer does some post-processing.
>>>>>>> I don't see how to avoid source incompatibility and uglification of failable initializer implementations here. Allowing `self = nil` inside a failable initializer would require `self` to be an optional. That in turn would require every use of `self` in the initializer to be nil-checked or forced. I don't think that loss everywhere outweighs the gain of `self = nil` in some places.
>>>>>>>
>>>>>>> --
>>>>>>> Greg Parker     gparker@apple.com     Runtime Wrangler
>>>>>>
>>>>>> _______________________________________________
>>>>>> swift-evolution mailing list
>>>>>> swift-evolution@swift.org
>>>>>> https://lists.swift.org/mailman/listinfo/swift-evolution