<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space;" class="">Hey, swift-evolution. I want to draw attention to one of the oddest parts of the Objective-C importer today: NSUInteger. TLDR: NSUInteger is treated differently based on whether it comes from a system framework or a user-provided header file. I think this is silly and that we should treat it consistently everywhere, but I haven’t had time to go collect data to demonstrate that this is a safe change. Would someone like to take that on?<div class=""><br class=""></div><div class="">If so, read on. (Or jump to the last section, and read these “Background” sections later.)<br class=""><div class=""><br class=""></div><div class=""><b class="">## Background: Importing Integer Types from C</b></div><div class=""><div class=""><br class=""></div><div class="">As everyone knows, the importer maps certain “known” Objective-C types to Swift types. This includes some mostly non-controversial mappings:</div><div class=""><br class=""></div><div class="">- Mapping fixed-sized integers: ‘int32_t’ to ‘Int32’</div><div class="">- Mapping common C types to fixed-sized integers: ‘unsigned short’ to ‘UInt16’</div><div class="">- Mapping C’s ‘long’ to Swift’s ‘Int’.*</div><div class="">- Mapping ‘intptr_t’ and ‘ptrdiff_t’ to ‘Int’ and ‘uintptr_t’ to ‘UInt’</div><div class="">- Mapping ‘NSInteger’ (and ‘CFIndex’) to ‘Int’</div><div class=""><br class=""></div><div class="">* ‘long’ is a pointer-sized integer on all common modern platforms except 64-bit Windows; we’ll have to do something different there. (‘CLong’ will always be the right type.)</div><div class=""><br class=""></div><div class="">And a few controversial ones:</div><div class=""><br class=""></div><div class="">- Both ‘size_t’ and ‘rsize_t’ are mapped to ‘Int’, not ‘UInt’. 
This is a pragmatic decision based on <b class="">Swift’s disallowing of mixed-sign arithmetic and comparisons</b>; if size_t and rsize_t really are used to represent sizes or counts in memory, they will almost certainly never be greater than Int.max. It’s definitely a tradeoff, though.</div><div class=""><br class=""></div><div class="">And finally we come to the strangest one, NSUInteger.</div><div class=""><br class=""></div><div class=""><b class="">## Background: NSUInteger</b></div><div class=""><br class=""></div><div class="">In (Objective-)C, NSUInteger is defined to be a word-sized unsigned integer without any stated purpose, much like uintptr_t. It conventionally gets used</div><div class=""><br class=""></div><div class="">1. to represent a size or index in a collection</div><div class="">2. as the base type of an enum defined with NS_OPTIONS</div><div class="">3. to store hash-like values</div><div class="">4. to store semantically-nonnegative 32-bit values, casually (as a compiler writer I’d suggest uint32_t instead)</div><div class="">5. to store semantically-nonnegative 64-bit values, casually (definitely not portable, would suggest uint64_t)</div><div class="">6. to store opaque identifiers known to be 32 or 64 bits (like 3, but either wasting space or non-portable)</div><div class=""><br class=""></div><div class="">(1) is actually the problematic case. Foundation fairly consistently uses NSUInteger for its collections, but UIKit and AppKit use NSInteger, with -1 as a common sentinel. In addition, the standard constant NSNotFound is defined to have a value of Int.max, so that it’s consistent whether interpreted as a signed or unsigned value.</div></div><div class=""><br class=""></div><div class="">For (2), the code really just wants a conveniently-sized unsigned value to use as a bitfield. In this case the importer consistently treats NSUInteger as UInt. 
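</div><div class=""><br class=""></div><div class="">To make case (2) concrete, here is a sketch; every name in it is made up:</div><div class=""><br class=""></div><div class="">

```objc
// Hypothetical user-framework header.
typedef NS_OPTIONS(NSUInteger, MYDownloadOptions) {
    MYDownloadOptionsResume   = 1 << 0,
    MYDownloadOptionsCellular = 1 << 1,
};

// The importer keeps the raw type unsigned, in system and user frameworks
// alike; the imported Swift interface looks roughly like:
//   public struct MYDownloadOptions : OptionSet {
//       public init(rawValue: UInt)
//       public var rawValue: UInt { get }
//   }
```

</div><div class=""><br class=""></div><div class="">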
We’re not going to talk about this case any more.</div><div class=""><br class=""></div><div class="">(3) is a lot like (2), except we don’t actually care what the sign of the value is. We just want to vary as many bits as possible when constructing a hash value; we don’t usually try to <i class="">sort</i> them (or add them, or compare them).</div><div class=""><br class=""></div><div class="">(4) is interesting; it’s entirely possible to have 32-bit counters that go past Int32.max. It’s not common, but it’s possible.</div><div class=""><br class=""></div><div class="">(5) seems much less likely than (4). Int64.max is <i class="">really</i> high, and if you’re already up in that range I’m not sure another bit will do you any good.</div><div class=""><br class=""></div><div class="">(6) is basically the same as (3); we don’t plan on interpreting these bits, and so we don’t <i class="">really</i> care what sign the type has.</div><div class=""><br class=""></div><div class="">Because of this, and <i class="">especially</i> because of the Foundation/*Kit differences, in Swift 1 we decided to <b class="">import NSUInteger as Int, but only in system frameworks</b> (and when not used as the raw type of an enum). In user frameworks, NSUInteger is consistently imported as UInt.</div><div class=""><br class=""></div><div class=""><b class="">## The Problem</b></div><div class=""><b class=""><br class=""></b></div><div class="">This is inconsistent. User frameworks should not have different rules from system frameworks. I’d like to propose that <b class="">NSUInteger be imported as Int everywhere</b> (except when used as the raw type of an enum). It’s not a perfect decision, but it is a pragmatic one, given that Swift is much stricter about mixing signedness than C. 
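</div><div class=""><br class=""></div><div class="">As an illustration of that strictness, consider this sketch (the names are made up); C would accept the commented-out comparison via implicit conversion, but Swift rejects it:</div><div class=""><br class=""></div><div class="">

```swift
let count: UInt = 5   // imagine a value vended by a user-framework NSUInteger API today
let index: Int = 3

// if index < count { ... }
// error: binary operator '<' cannot be applied to operands of type 'Int' and 'UInt'

if index < Int(count) {   // every use site needs an explicit conversion
    // ...
}
```

</div><div class=""><br class=""></div><div class="">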
I’d hope the logic above convinces you that it won’t be a disaster, either—it hasn’t been for Apple’s frameworks.</div><div class=""><br class=""></div><div class=""><div class="">The recommended idiom for “no, really a word-sized unsigned integer” would be ‘uintptr_t’, but unless you are actually trying to store a pointer as an integer it’s likely that uint32_t or uint64_t would be a better C type to use anyway.</div></div><div class=""><br class=""></div><div class=""><div class="">For people who would suggest that Swift actually take unsigned integers seriously instead of using ‘Int’ everywhere, I sympathize, but I think that ship has sailed—not with us, but with all the existing UIKit code that uses NSInteger for counters. Consistently importing NSUInteger as UInt would be a <i class="">massive</i> source-break in Swift 4 that just wouldn’t be worth it. Given that, is it better to more closely model what’s in user headers, or to have consistency between user and system headers?</div></div><div class=""><br class=""></div><div class="">(All of this would only apply to Swift 4 mode. Swift 3 compatibility mode would continue to do the funny thing Swift has always done.)</div><div class=""><br class=""></div><div class=""><b class="">## The Request</b></div><div class=""><br class=""></div><div class="">Consistently importing NSUInteger as Int would be a pretty major change to how we import existing Objective-C code, and has the potential to break all sorts of mixed-source projects, or even just projects with Objective-C dependencies (perhaps longstanding CocoaPods). Because of this, I’ve held off on proposing it for…a long time now. 
The last piece, I think, is to <b class="">find out how Objective-C projects are using NSUInteger in their headers:</b></div><div class=""><br class=""></div><div class="">- Do they have no NSUIntegers at all?</div><div class="">- Are they using NSUInteger because they’re overriding something that used NSUInteger, or implementing a protocol method that used NSUInteger?</div><div class="">- Are they using NSUInteger as an opaque value, where comparisons and arithmetic are uninteresting?</div><div class="">- Are they using NSUInteger as an index or count of something held in memory?</div><div class="">- Are they using NSUInteger as the raw value of an NS_OPTIONS enum?</div><div class="">- Or is it something else? (These are the most interesting cases, which we probably want to write down.)</div></div><div class=""><br class=""></div><div class="">If the answers all land in one of these buckets, or even 90% in one of these buckets, then I think we’d be safe in proposing this change; if it turns out there are many interesting uses I didn’t account for, then of course we won’t. But I do think we need to do this research.</div><div class=""><br class=""></div><div class=""><b class="">Is someone willing to go look at modern CocoaPods and sample code, and ask other developers from major Swift-using companies, to find out how they’re using NSUInteger?</b> And then write up their methodology and report back to us at swift-evolution. If you do this, I will be quite grateful.</div><div class=""><br class=""></div><div class="">Thank you!</div><div class="">Jordan</div><div class=""><br class=""></div><div class="">P.S. For Apple folks, this is <a href="rdar://problem/20347922" class="">rdar://problem/20347922</a>.</div><div class=""><br class=""></div></body></html>