[swift-evolution] [Draft] Target-specific CChar

William Dillon william at housedillon.com
Wed Mar 2 13:22:51 CST 2016


On March 2, 2016 at 11:06:55 AM, Dmitri Gribenko (gribozavr at gmail.com) wrote:
On Wed, Mar 2, 2016 at 11:03 AM, William Dillon <william at housedillon.com> wrote:  
>> It does violate the principle of least astonishment, but we should
>> acknowledge that the implementation-specific nature of C's char signedness
>> is making code *less* portable, not more -- because the same code can mean
>> different things on different platforms. Reflecting the same in Swift makes
>> Swift code less portable, too.
>>
>> Dmitri
>
> That is a fair point, and I agree for the most part. However, it is my
> intent and expectation that the use of CChar would be limited to the margins
> where C APIs are imported. Once values become a part of Swift (and used in
> places outside of the C interface) they should have been cast into a pure
> Swift type (such as UInt8, Int8, Int, etc.).

True, but how can you cast a CChar portably into UInt8 or Int8? Only
via the bitPattern initializer, because the regular initializer will
trap on values outside the 0..<128 range (when converting to UInt8 on
signed-char platforms, or to Int8 on unsigned-char platforms).

Dmitri  
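
A minimal sketch of the conversion Dmitri describes, assuming a platform where CChar is imported as Int8 (the byte value here is only an illustration):

let byte: CChar = -61                        // bit pattern 0xC3, e.g. the lead byte of UTF-8 "é"

let viaBitPattern = UInt8(bitPattern: byte)  // 195; reinterprets the bits, never traps
// let viaValue = UInt8(byte)                // runtime trap: -61 is outside 0..<128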


Yes, that’s true.  And I think that really gets to the crux of the issue.  Right now, Swift is doing that (bitPattern) for you, and you don’t have a say in the matter.  However, when you transition from CChar to either Int8 or UInt8, you are probably aware of the implications of that choice.  You should know whether the important quality is the raw bit pattern or the numeric value it represents.  If it’s the latter, a trap on overflow might be an excellent diagnostic of failed assumptions; if it’s the former, use bitPattern.
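
A sketch of the two intents Will distinguishes, again assuming CChar is Int8 and using a made-up byte value:

let raw: CChar = -61

// The bit pattern is what matters (e.g. the byte belongs to a UTF-8 buffer):
// reinterpret explicitly; this never traps.
let asByte = UInt8(bitPattern: raw)          // 195

// The numeric value is what matters (e.g. the C API promises a small,
// non-negative count): the checked conversion traps on overflow, surfacing
// the failed assumption immediately.
// let asCount = UInt8(raw)                  // traps: -61 is not representable as UInt8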

- Will