[swift-evolution] [Draft] Target-specific CChar
william at housedillon.com
Mon Mar 7 10:24:01 CST 2016
> On Mar 4, 2016, at 5:52 PM, Michel Fortin <michel.fortin at michelf.ca> wrote:
>> As it stands, Swift’s indifference to the signedness of char while importing from C can be ignored in many cases. The consequences of inaction, however, leave the door open for extremely subtle and difficult-to-diagnose bugs any time a C API relies on values of 128 or greater on platforms with unsigned char; in this case the current import model certainly violates the Principle of Least Astonishment.
>> This is not an abstract problem that I want to have solved “just because.” This issue has been a recurrent theme, and has come up several times during code review. I’ve included a sampling of these to provide some context to the discussion:
>> • Swift PR–1103
>> • Swift Foundation PR–265
>> In these discussions we clearly struggle to adequately solve the issues at hand without introducing the changes proposed here. Indeed, this proposal was suggested in Swift Foundation PR–265 by Joe Groff.
> I don't want to downplay the issue, but I also wouldn't want the remedy to be worse than the problem it tries to solve.
Believe me when I say that any solution that is worse than the current state would be unacceptable to me.
> One remedy is to have char map to Int8 or UInt8 depending on the platform, but this introduces C portability issues in Swift itself so it's not very good.
> Another remedy is to have an "opaque" CChar type that you have to cast to Int8 or UInt8 manually. While this removes the portability issue, it introduces friction in Swift whenever you need to use a C API that uses char. It also introduces a risk that the manual conversion is done wrong.
I’ve updated the Gist; the second remedy you mention is the current preferred solution. It’s true that the manual conversion could be done wrong, but please remember that the conversion already happens today; it’s just done for you by Swift. The possibility of an inappropriate conversion is just as real now, only the user has no control over it.
> Given the distribution of platforms using an unsigned char by default, I do wonder if there are libraries out there that actually depend on that. It seems to me that any C code depending on an unsigned char by default is already at risk of silently producing wrong results just by moving to a different CPU. In most circumstances, that'd be considered a bug in the C code.
That is not an unreasonable position.
> So, my question is: are we contemplating complicating the language and introducing friction for everyone only for a theoretical interoperability problem that would never happen in practice? I would suggest that perhaps the best remedy for this would be to just translate char to Int8 all the time, including on those platforms-architecture combos where it goes against the default C behavior. Just document somewhere that it is so, perhaps offer a flag so you can reverse the importer's behavior in some circumstances, and be done with it.
Again, my intention is not to introduce undue friction. I’ve ported the standard library (and some of Foundation) for each of the candidate solutions, and the changes are extremely minor. In fact, in the vast majority of cases it is enough to simply select the correct type when designing method signatures and variables.
Thanks for taking the time to share your thoughts.