> On May 26, 2017, at 12:04 AM, Xiaodi Wu via swift-evolution <swift-evolution@swift.org> wrote:
>
> I've often wondered if even just "bitPattern" might suffice, as the truncating or extending of it should not be particularly surprising.

Being explicit about bit pattern truncation or extension is valuable. It helps catch bugs where the bit count is not what the author expected. In Swift it is especially important for types like Int and CGFloat, whose sizes are platform-dependent.

    let x: Int64 = …
    let y = Int(bitPattern: x)  // if truncation is implicit then this may be surprising on 32-bit platforms

    let a: UInt32 = …
    let b = Int(bitPattern: a)  // if zero- or sign-extension is implicit then this may be surprising on 64-bit platforms

-- 
Greg Parker     gparker@apple.com     Runtime Wrangler
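
For reference, the Swift 4 standard library (SE-0104) spells each of these operations out at the call site. A minimal sketch, assuming the truncatingIfNeeded, same-width bitPattern, and exactly initializers; the values are illustrative:

    let x: Int64 = 0x1_0000_0001
    let y = Int(truncatingIfNeeded: x)   // truncation is named: keeps the low Int.bitWidth bits of x
    let z = Int(exactly: x)              // Int?: nil on 32-bit platforms, 4294967297 on 64-bit

    let a: UInt32 = 0x8000_0000
    let b = Int32(bitPattern: a)         // same width only: pure reinterpretation, b == Int32.min
    let c = Int(truncatingIfNeeded: a)   // widening is also named: zero-extends the unsigned source

Because bitPattern is restricted to same-width pairs, any truncation or extension has to be named explicitly, which is exactly where a wrong bit count would otherwise hide.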