> On Oct 20, 2016, at 8:41 AM, Erik Eckstein <eeckstein@apple.com> wrote:
>
>> To clarify: I proposed an alternate approach in which the @sil_cow reference is only mutable during the Array's @inout scope, to be automatically enforced by the compiler once @inout scopes are enforced. But the text in question is not referring to that approach, so your comments are on target.
>
> After thinking about Joe's suggestion (having the cow attribute on the class type and making a reference to that type move-only), I'm more inclined to go with the isUnique builtin. If such a reference can only be returned by isUnique, it is really guaranteed that only a uniquely referenced buffer can be mutated. With the inout approach, the programmer is not forced to make the uniqueness check before modifying the buffer.

In my mind, relying on a move-only reference type is exactly what I was advocating for, but it relies on a language feature rather than a "special" compiler verification. This all still needs to work with an 'inout' Array. The compiler will effectively be doing the same verification that I was proposing, but as a side effect of move-only semantics (type system support makes it much easier). The isUnique builtin would just be a mechanism to get the mutable type, and the endUnique builtin is the mechanism to move the type back. As Dave pointed out, we could provide additional mechanisms for mutation that don't depend on uniqueness. But the SIL optimizer doesn't need to be explicitly taught about any of those builtin mechanisms for correctness.
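For concreteness, here is roughly the pattern the check is meant to guard, sketched with today's stdlib isKnownUniquelyReferenced(_:) standing in for the proposed isUnique/endUnique builtins. CowBuffer and CowArray are made-up names for illustration, not the real Array internals:

    // Hypothetical sketch only; not the actual stdlib implementation.
    final class CowBuffer {
        var storage: [Int] = []
    }

    struct CowArray {
        private var buffer = CowBuffer()

        mutating func append(_ x: Int) {
            // The uniqueness check must come before any mutation. A
            // move-only mutable reference, obtainable only from the check,
            // would let the compiler enforce that ordering.
            if !isKnownUniquelyReferenced(&buffer) {
                let copy = CowBuffer()
                copy.storage = buffer.storage   // copy-on-write
                buffer = copy
            }
            buffer.storage.append(x)            // safe: buffer is unique here
        }
    }

As I understand the proposal, the mutation on the last line could only be written against the reference handed back by isUnique, and endUnique would give it up before the inout access to the Array ends.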
More importantly, the user is no longer responsible for some easy-to-violate, unverified property of the data type as a whole.

-Andy