[swift-evolution] [Draft] Target-specific CChar

Joe Groff jgroff at apple.com
Wed Mar 2 13:27:34 CST 2016


> On Mar 2, 2016, at 11:14 AM, Dmitri Gribenko <gribozavr at gmail.com> wrote:
> 
>> On Wed, Mar 2, 2016 at 11:13 AM, Dmitri Gribenko <gribozavr at gmail.com> wrote:
>> On Wed, Mar 2, 2016 at 11:10 AM, Joe Groff <jgroff at apple.com> wrote:
>>> 
>>>> On Mar 2, 2016, at 11:06 AM, Dmitri Gribenko via swift-evolution <swift-evolution at swift.org> wrote:
>>>> 
>>>> On Wed, Mar 2, 2016 at 11:03 AM, William Dillon <william at housedillon.com> wrote:
>>>>> It does violate the principle of least astonishment, but we should
>>>>> acknowledge that the implementation-specific nature of C's char signedness
>>>>> makes code *less* portable, not more -- because the same code can mean
>>>>> different things on different platforms.  Reflecting the same behavior in
>>>>> Swift makes Swift code less portable, too.
>>>>> 
>>>>> Dmitri
>>>>> 
>>>>> That is a fair point, and I agree for the most part.  However, it is my
>>>>> intent and expectation that the use of CChar would be limited to the margins
>>>>> where C APIs are imported.  Once values become part of Swift (and are used
>>>>> outside of the C interface), they should be converted into a pure Swift
>>>>> type (such as UInt8, Int8, Int, etc.).
>>>> 
>>>> True, but how can you convert a CChar portably into UInt8 or Int8?  Only
>>>> via the bitPattern initializer, because the regular initializer will
>>>> trap on values outside the 0..<128 range, whether the platform's CChar
>>>> is signed or unsigned.
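
For concreteness, assuming a platform where CChar is a typealias for Int8
(signed char), the two kinds of conversion behave like this:

    let c: CChar = -1               // e.g. the byte 0xFF coming back from a C API
    let u = UInt8(bitPattern: c)    // reinterprets the bits: u == 255, never traps
    // let t = UInt8(c)             // traps at runtime: -1 is out of UInt8's range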
>>> 
>>> Could we treat CChar as a "signless" byte type, so that UInt8(cchar) and Int8(cchar) both just reinterpret the bit pattern?
>> 
>> That is viable, but it opens a whole other can of worms:
>> 
>> - we need CChar to be a separate type,
>> 
>> - other unlabelled integer initializers trap when they would change the
>> numeric value, so this one would be wildly inconsistent.
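
To make that inconsistency concrete, the existing unlabelled initializers
behave like this (values illustrative):

    let byte: UInt8 = 200
    // let i = Int8(byte)            // traps: 200 does not fit in Int8
    let i = Int8(bitPattern: byte)   // explicit opt-in to reinterpreting the bits

An unlabelled Int8(cchar) that silently reinterpreted the bits would break
that convention.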
> 
> ... but if we don't provide any arithmetic operations on the CChar
> type, and treat it as an opaque byte-sized character type that you can
> only convert to UInt8/Int8, it might not be bad!

Yeah, it would make sense to me if CChar were opaque and you had to construct an Int8 or UInt8 from it to specify the arithmetic semantics you want.
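
A minimal sketch of what that could look like, with hypothetical names (an
illustration of the idea, not part of the draft):

    // Hypothetical opaque CChar: byte-sized storage, no arithmetic operations.
    struct OpaqueCChar {
        let bits: UInt8                  // raw byte, signedness-agnostic
    }

    extension UInt8 {
        // Reinterprets the bit pattern; cannot trap.
        init(_ c: OpaqueCChar) { self = c.bits }
    }

    extension Int8 {
        // Reinterprets the bit pattern; cannot trap.
        init(_ c: OpaqueCChar) { self = Int8(bitPattern: c.bits) }
    }

Converting to a wider type like Int would then require choosing Int8 or UInt8
first, making the intended signedness explicit at the use site.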

-Joe