> It does violate the principle of least astonishment, but we should
> acknowledge that the implementation-specific nature of C's char signedness
> is making code *less* portable, not more -- because the same code can mean
> different things on different platforms. Reflecting the same in Swift makes
> Swift code less portable, too.
>
> Dmitri

That is a fair point, and I agree for the most part. However, it is my intent and expectation that the use of CChar would be limited to the margins where C APIs are imported. Once values become part of Swift (and are used outside of the C interface), they should already have been converted to a pure Swift type (such as UInt8, Int8, Int, etc.); see the sketch below my signature.

- Will
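
To make that boundary concrete, here is a minimal sketch. The imported C function c_get_name is hypothetical (assume a C declaration of const char *c_get_name(void), which Swift would import as func c_get_name() -> UnsafePointer<CChar>!); the only point is that CChar never escapes the wrapper.

// CChar appears only at the import boundary; callers see pure Swift types.
func name() -> String {
    // String(cString:) takes an UnsafePointer<CChar> regardless of the
    // platform's char signedness, so nothing outside this function depends
    // on whether CChar happens to be Int8 or UInt8.
    return String(cString: c_get_name())
}

// When the raw bytes are needed, go through UTF-8 rather than passing
// CChar values around the rest of the program.
func nameBytes() -> [UInt8] {
    return Array(name().utf8)
}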