[swift-evolution] [Pitch] Ban the top value in Int/UInt

Jeremy Pereira jeremy.j.pereira at googlemail.com
Thu Oct 20 04:45:37 CDT 2016


> On 19 Oct 2016, at 16:13, Guoye Zhang <cc941201 at me.com> wrote:
> 
> 
>> 在 2016年10月19日,07:10,Jeremy Pereira <jeremy.j.pereira at googlemail.com> 写道:
>> 
>> 
>>> On 18 Oct 2016, at 19:17, Guoye Zhang via swift-evolution <swift-evolution at swift.org> wrote:
>>> 
>>> Currently, the Swift Int family and UInt family have compact representations that utilize all available values, which is inherited from C. However, this makes optional integers horribly inefficient to implement: it takes double the space to store [Int?] than to store [Int] because of alignment.
>> 
>> Is this a general problem with Swift? Are lots of people complaining that they are running out of space for their Optional<Int> arrays?
>> 
> It's just that a common data type wasting almost half its space seems inefficient. I guess this is also the reason why optional integers weren't adopted widely in the stdlib.
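The doubling is easy to demonstrate. On a typical 64-bit platform, Int? carries an extra tag byte for the enum case, and alignment then pads each array element out to two words:

```swift
// MemoryLayout reports the per-element cost in an Array.
// Figures below assume a typical 64-bit platform.
print(MemoryLayout<Int>.stride)   // 8
print(MemoryLayout<Int?>.stride)  // 16: 8 bytes of payload + tag + padding
```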

Int? is an enum wrapping an integer, so why wouldn’t you expect it to be bigger than an Int? I honestly don’t get why this is suddenly a huge problem. If you are working in a situation where you need an in-memory array of around a billion Ints, I agree it becomes an issue, but there’s nothing stopping you from implementing the sentinel convention manually for that one application.
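As a minimal sketch of doing it by hand, reserving Int.min as the nil sentinel (the type name CompactOptionalInt is hypothetical, not anything in the stdlib):

```swift
// Hypothetical sketch: encode "no value" as Int.min by convention,
// so each element costs exactly one machine word instead of two.
struct CompactOptionalInt {
    private var raw: Int  // Int.min is reserved to mean nil

    init(_ value: Int?) {
        precondition(value != Int.min, "Int.min is reserved as the nil sentinel")
        raw = value ?? Int.min
    }

    var value: Int? {
        raw == Int.min ? nil : raw
    }
}

let xs = [CompactOptionalInt(42), CompactOptionalInt(nil)]
// One word per element, unlike [Int?]:
assert(MemoryLayout<CompactOptionalInt>.stride == MemoryLayout<Int>.stride)
```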


>> 
>>> 
>>> I propose to ban the top value in Int/UInt which is 0xFFFF... in hex. Int family would lose its smallest value, and UInt family would lose its largest value. Top value is reserved for nil in optionals. An additional benefit is that negating an Int would never crash.
>> 
>> Well, the “top value” for signed ints would have to be 0x8000..., not 0xffff..., which is the representation of -1. The top value for unsigned ints cannot be banned because unsigned integers are often used as bit fields, either directly or in OptionSets.
>> 
>> Furthermore, how would the semantics of &+ and &- be affected? What about the performance of those two operators?
>> 
> I was originally going for symmetry between Int and UInt, as in compatible bit patterns. Now that I think of it, UInt is commonly used for bitwise operations, and it doesn't make sense to optimize for "UInt?", which is uncommon. So I agree that 0x80... is better.
> 
> Int performance would surely suffer on current instruction sets, but Int? would improve.

I wouldn’t want to trade Int performance off against Int? performance. I think the former is much more common. 
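To make the bit patterns and the wrapping behaviour above concrete (using Int8 so the values stay short):

```swift
// In two's complement, the all-ones pattern is -1, not the minimum value;
// the minimum value has only the sign bit set (the 0x80... pattern).
let allOnes  = Int8(bitPattern: 0xFF)  // -1
let minValue = Int8(bitPattern: 0x80)  // Int8.min, i.e. -128

// The &+/&- question: today overflow wraps straight to the minimum value
// in a single hardware instruction. Banning that value would force the
// wrapping operators to do extra work.
assert(Int8.max &+ 1 == Int8.min)
```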

> 
>>> 
>>> So what do you think? Can we break C compatibility a bit for better Swift types?
>> 
>> 
>> Well, it’s not just C compatibility, it’s underlying processor compatibility. And actually, yes, I think C compatibility is vastly more important than being able to make your [Int?] arrays smaller, considering that full 2’s complement numbers are what OS calls and libc calls expect.
>> 
> Yes, that is also what Joe said came out of their previous internal discussion. Anyway, I know this is improbable, and I'm just glad that this possibility is considered.

I agree that it’s important to discuss these ideas. When you proposed this, my first reaction was “this is crazy”, but reading the rationale and other posts made me realise that my reaction was almost reflexive, along the lines of “it’s always been this way, why change?”. Your post forced me to sit down and think about why it should or shouldn’t be implemented in Swift. As you can see, my final position didn’t change, but you made me think, and whichever way the discussion eventually goes, that’s a good thing.


> 
> - Guoye
