I agree with your point: base-10 notation is not compatible with a base-2 infrastructure. However, that just exposes the reality of the floating-point data type. It still does not answer my question about why we can't simply provide a decimal data type.

I am aware of the NSDecimalNumber class. But that is a layer on top of the core language and, I would presume, not very performant by comparison. Please correct me if I am wrong.

-T.J.

On Sep 12, 2016, at 10:26, Jens Alfke <jens@mooseyard.com> wrote:

>> On Sep 12, 2016, at 10:10 AM, Teej . via swift-users <swift-users@swift.org> wrote:
>>
>> ...in spite of the CPU's quirks in handling floating point numbers in a maddeningly inaccurate manner.
>
> Well, in the CPU's defense, it's only inaccurate because the puny humans insist on dividing their currency into fractions of 1/10, which has no exact representation in binary. (Apparently this is an ancient tradition commemorating the number of bony outgrowths on certain extremities of their grotesque meat-bodies.) I could -- I mean, *the computers* could -- point out that if we divided our currency units into 7 pieces, our precious decimal numbers would quickly become inaccurate too. :)
>
>> Is there any particular reason why we do not have a native Decimal data type for Swift?
>
> Cocoa's Foundation framework has an NSDecimalNumber class that provides decimal numbers and arithmetic. The class docs for that include a note that "The Swift overlay to the Foundation framework provides the Decimal structure, which bridges to the NSDecimalNumber class.
> The Decimal value type offers the same functionality as the NSDecimalNumber reference type, and the two can be used interchangeably in Swift code that interacts with Objective-C APIs."
>
> The question is whether this has been ported to the in-progress native Swift Foundation library yet. I haven't checked.
>
> -Jens
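
To make Jens's point about 1/10 concrete, here is a minimal Swift sketch, assuming Foundation is available (the printed digits are approximate): Double can only hold the binary value nearest to 0.1, while Foundation's Decimal keeps the same arithmetic exact.

import Foundation

// 1/10 has no exact binary representation, so Double stores only the
// nearest representable value; three of them do not sum to exactly 0.3.
let tenth: Double = 0.1
print(tenth + tenth + tenth == 0.3)          // false
print(String(format: "%.20f", tenth))        // ~0.10000000000000000555

// Decimal stores the value in base 10, so the same sum stays exact.
let d = Decimal(string: "0.1")!
print(d + d + d == Decimal(string: "0.3")!)  // true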
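
And a quick sketch of the bridging the quoted docs note describes, assuming an Apple-platform Foundation where both types are available: Decimal is the value type you work with in Swift, and it converts to and from NSDecimalNumber with a plain "as" cast when an Objective-C API needs the class.

import Foundation

// Exact decimal arithmetic with the Swift value type.
let price = Decimal(string: "19.99")!
let total = price * Decimal(3)             // exactly 59.97

// The value type bridges to the Objective-C reference type and back.
let objcNumber = total as NSDecimalNumber  // Decimal -> NSDecimalNumber
let backAgain = objcNumber as Decimal      // NSDecimalNumber -> Decimal
print(total, objcNumber, backAgain)        // 59.97 59.97 59.97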