On March 2, 2016 at 11:14:26 AM, Dmitri Gribenko (gribozavr@gmail.com) wrote:

> On Wed, Mar 2, 2016 at 11:13 AM, Dmitri Gribenko <gribozavr@gmail.com> wrote:
>> On Wed, Mar 2, 2016 at 11:10 AM, Joe Groff <jgroff@apple.com> wrote:
>>>
>>>> On Mar 2, 2016, at 11:06 AM, Dmitri Gribenko via swift-evolution <swift-evolution@swift.org> wrote:
>>>>
>>>> On Wed, Mar 2, 2016 at 11:03 AM, William Dillon <william@housedillon.com> wrote:
>>>>> It does violate the principle of least astonishment, but we should
>>>>> acknowledge that the implementation-specific nature of C's char signedness
>>>>> is making code *less* portable, not more -- because the same code can mean
>>>>> different things on different platforms. Reflecting the same in Swift makes
>>>>> Swift code less portable, too.
>>>>>
>>>>> Dmitri
>>>>>
>>>>> That is a fair point, and I agree for the most part. However, it is my
>>>>> intent and expectation that the use of CChar would be limited to the margins
>>>>> where C APIs are imported. Once values become a part of Swift (and used in
>>>>> places outside of the C interface) they should have been cast into a pure
>>>>> Swift type (such as UInt8, Int8, Int, etc.).
>>>>
>>>> True, but how can you cast a CChar portably into UInt8 or Int8? Only
>>>> via the bitPattern initializer, because the regular initializer will
>>>> trap on values outside of the 0..<128 range on signed platforms or
>>>> unsigned platforms.
>>>
>>> Could we treat CChar as a "signless" byte type, so that UInt8(cchar) and Int8(cchar) both just reinterpret the bit pattern?
>>
>> That is viable, but it opens a whole another can of worms:
>>
>> - we need CChar to be a separate type,
>>
>> - other unlabelled integer initializers trap when changing the numeric
>> value, and this one would be wildly inconsistent.
>
> ... but if we don't provide any arithmetic operations on the CChar
> type, and treat it as an opaque byte-sized character type that you can
> only convert to UInt8/Int8, it might not be bad!
>
> Dmitri

I, too, like this solution. It would be moderately more work to transition to, but I think it's better overall.

- Will
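
P.S. To make the "opaque CChar" idea a little more concrete, here is a very rough sketch of the shape such a type could take. The name and members below are invented purely for illustration (this is not the actual standard library API); the point is just a byte-sized type with no arithmetic whose only conversions are explicit, non-trapping bit-pattern reinterpretations:

    /// Hypothetical stand-in for an opaque, sign-agnostic C char type.
    /// It deliberately provides no arithmetic; the only way out is an
    /// explicit reinterpretation of the bits, which can never trap.
    struct OpaqueCChar {
        // Stored as Int8 purely as an implementation detail; callers
        // never observe the underlying signedness.
        private let bits: Int8

        init(_ value: Int8)  { bits = value }
        init(_ value: UInt8) { bits = Int8(bitPattern: value) }

        /// Bit-pattern view as an unsigned byte; never traps.
        var asUInt8: UInt8 { return UInt8(bitPattern: bits) }

        /// Bit-pattern view as a signed byte; never traps.
        var asInt8: Int8 { return bits }
    }

    // A byte with the high bit set converts the same way on every
    // platform, unlike today's UInt8(cchar), which traps on signed-char
    // platforms when the value is negative.
    let c = OpaqueCChar(Int8(bitPattern: 0xC3))
    let byte: UInt8 = c.asUInt8   // 0xC3 everywhere, no trap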