[swift-evolution] [proposal] Decouple definition of Int8 from target char type
Dmitri Gribenko
gribozavr at gmail.com
Fri Feb 26 01:13:27 CST 2016
On Thu, Feb 25, 2016 at 9:58 PM, William Dillon <william at housedillon.com> wrote:
>>> Swift currently maps the Int8 type to the char type of the target platform. On targets where char is unsigned by default, Int8 becomes an unsigned 8-bit integer, which is a clear violation of the Principle of Least Astonishment. Furthermore, it is impossible to specify a signed 8-bit integer type on platforms with unsigned chars.
>>
>> I'm probably misunderstanding you, but are you sure that's what is
>> happening? I can't imagine how the standard library would just
>> silently make Int8 unsigned on Linux arm.
>>
>
> I think the best way to demonstrate this is through an example. Here is a sample Swift program:
>
> import Foundation
> print(NSNumber(char: Int8.min).shortValue)
There is a lot happening in this snippet of code (including importing
two completely different implementations of Foundation, the pure
Swift one not being affected by the Clang importer at all). Could you
provide AST dumps of this code for both platforms?
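For reference, a dump can be produced on either platform with
something like the following (the file name is illustrative):

    swiftc -dump-ast test.swift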
>> What I would expect to happen is that on Linux arm the Clang importer
>> would map 'char' to UInt8, instead of mapping it to Int8 like it does
>> on x86_64.
>
> That would approach a satisfactory solution, except that it would lead to frustration in the long term and, ultimately, an expansion of the number of special cases. Any API that relies upon the definition of char would be bifurcated. The user would have to either bracket their code with #if blocks (and know which platform specifies what), or explicitly cast to a consistent type at every point where char is used, as in the sketch below. And when providing values to C, the reverse is true: the user would have to know which platforms do what, and explicitly cast their internally-used type into the correct type for char first.
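>
> As a sketch of that bracketing (the platform conditions here are
> illustrative, not an exhaustive list):
>
>     #if os(Linux) && arch(arm)
>     // char is unsigned on this target
>     typealias PlatformChar = UInt8
>     #else
>     // char is signed on this target
>     typealias PlatformChar = Int8
>     #endif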
>
> By using CChar, the user isn’t required to maintain this knowledge and list of platforms in countless locations in their code. All they would have to do is cast from CChar to whatever type they want to use within Swift. When going the other way, to get a value from Swift to C, they just cast to CChar and the correct action is taken. In cases where the Swift type is the same as CChar, the cast is basically a no-op.
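>
> A minimal sketch of that round trip (in present-day Swift syntax;
> the helper name is hypothetical):
>
>     // C -> Swift: normalize to a single Swift-side type at the boundary.
>     func swiftBytes(fromC p: UnsafePointer<CChar>, count: Int) -> [UInt8] {
>         return (0..<count).map { UInt8(truncatingIfNeeded: p[$0]) }
>     }
>
>     // Swift -> C: cast back to CChar when handing the value to C.
>     // On platforms where CChar has the same representation, this
>     // preserves the bit pattern unchanged.
>     let c = CChar(truncatingIfNeeded: 0x80 as UInt8)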
>
> Another benefit is that the process brings awareness to the fact that char is not defined consistently across platforms. I believe it is worthwhile for people to understand the implications of the code they write, and the cast from CChar to something else provides an opportunity for a moment of reflection and for asking the question: “What am I doing here, and what do I want?”
I agree it makes sense to do what you are proposing, if we agree on
the premise that having 'char' preserve the platform's C notion of
signedness is a good thing. That is what we need to figure out:
whether we want to preserve that difference or erase it completely.
What I'm seeing now is that we are failing to erase the difference.
>>> C type        | Swift type
>>> -=-=-=-=-=-=-=-=-=-=-=-=-=-=-
>>> char          | CChar
>>> unsigned char | UInt8
>>> signed char   | Int8
>>
>> This brings in the notion of the CChar type, and requires us to define
>> (hopefully!) some rules for type-based aliasing, since you want to be
>> able to freely cast UnsafePointer<CChar> to UnsafePointer<UInt8> or
>> UnsafePointer<Int8>.
>>
>
> Swift already has a CChar.
I agree, but it is not set in stone. We can change it, or remove it,
if it would make sense.
> It’s defined in https://github.com/apple/swift/blob/master/stdlib/public/core/CTypes.swift#L19 The usage in CTypes.swift relies upon the fact that Int8 has this dual meaning. I agree that the ability to cast between UnsafePointers specialized to each type is desirable.
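>
> For illustration, one way such a cast can be spelled in present-day
> Swift is withMemoryRebound (a sketch; the function name and counting
> scheme are illustrative):
>
>     func sum(ofCBytes p: UnsafePointer<CChar>, count: Int) -> Int {
>         return p.withMemoryRebound(to: UInt8.self, capacity: count) { u8 in
>             (0..<count).reduce(0) { $0 + Int(u8[$1]) }
>         }
>     }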
>
>> What about a proposal where we would always map 'char' to Int8,
>> regardless of C's idea of signedness?
>>
>
> In a very real sense this is exactly what is happening currently.
Sorry, I don't see that yet -- it is still unclear to me what is happening.
Dmitri
--
main(i,j){for(i=2;;i++){for(j=2;j<i;j++){if(!(i%j)){j=0;break;}}if
(j){printf("%d\n",i);}}} /*Dmitri Gribenko <gribozavr at gmail.com>*/