When I first looked into Swift, I noticed that the base type was called “UInt8” (and “Int8”) and not something like “Byte.” I know modern computers have followed the bog-standard 8/16/32(/64) architecture for decades, but why hard-code it into the language/library? Why should 36-bit processors with 9-bit bytes, or processors that start at 16 bits, be excluded right off the bat? Did you guys see a problem with how (Objective-)C(++) had to define its base types in a mushy way to accommodate the possibility of non-octet bytes?

BTW, is there an equivalent of CHAR_BIT, the number of bits per byte, in the library? Or are we supposed to hard-code an “8”?
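For context, the closest spellings I could find are below; this is just a sketch assuming the current standard library’s FixedWidthInteger and MemoryLayout APIs, and neither one really answers the portability question, since both have the octet baked in:

    // Sketch assuming current Swift stdlib APIs; not a true CHAR_BIT.

    // FixedWidthInteger exposes a static bit count, but it is fixed
    // per type rather than derived from the platform's byte size.
    let bitsInUInt8 = UInt8.bitWidth                  // 8

    // MemoryLayout reports sizes in 8-bit bytes by definition, so
    // computing "bits per byte" this way just bakes the 8 back in.
    let bitsPerByte = MemoryLayout<UInt8>.size * 8    // 8, by construction

    print(bitsInUInt8, bitsPerByte)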
<div style="color: rgb(0, 0, 0); letter-spacing: normal; orphans: auto; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px; -webkit-text-stroke-width: 0px; word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space;" class=""><div class="">— </div><div class="">Daryle Walker<br class="">Mac, Internet, and Video Game Junkie<br class="">darylew AT mac DOT com </div></div>
<br class=""></body></html>