> On Sep 12, 2016, at 10:10 AM, Teej . via swift-users <swift-users@swift.org> 
> wrote:
> 
>       …in spite of the CPU’s quirks in handling floating point numbers in a 
> maddeningly inaccurate manner.

Well, in the CPU’s defense, it’s only inaccurate because the puny humans insist 
on dividing their currency into fractions of 1/10, which has no exact 
representation in binary. (Apparently this is an ancient tradition 
commemorating the number of bony outgrowths on certain extremities of their 
grotesque meat-bodies.) I could — I mean, the computers could — point out that 
if we divided our currency units into 7 pieces, our precious decimal numbers 
would quickly become inaccurate too. :)
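
For instance, here is a quick Swift illustration (mine, not part of the original exchange) of what the binary representation of 0.1 costs you: summing ten dimes as a Double does not quite reach a dollar.

    // 0.1 has no exact binary representation, so each addition
    // carries a tiny rounding error that accumulates.
    let price = 0.1
    var total = 0.0
    for _ in 0..<10 {
        total += price
    }
    print(total)              // 0.9999999999999999
    print(total == 1.0)       // false
    print(0.1 + 0.2 == 0.3)   // false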

>       Is there any particular reason why we do not have a native Decimal data 
> type for Swift?  

Cocoa’s Foundation framework has an NSDecimalNumber class that provides decimal 
numbers and arithmetic. The class docs for that include a note that "The Swift 
overlay to the Foundation framework provides the Decimal structure, which 
bridges to the NSDecimalNumber class. The Decimal value type offers the same 
functionality as the NSDecimalNumber reference type, and the two can be used 
interchangeably in Swift code that interacts with Objective-C APIs."
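
As a rough sketch of what that buys you (my example, not from the class docs), the same ten-additions exercise stays exact when done in base-10 Decimal. Parsing the value from a string keeps the literal from passing through Double on the way in.

    import Foundation

    // Decimal stores base-10 digits, so 0.1 is represented exactly.
    let price = Decimal(string: "0.1")!
    var total = Decimal(0)
    for _ in 0..<10 {
        total = total + price
    }
    print(total)                 // 1
    print(total == Decimal(1))   // true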

The question is whether this has been ported to the in-progress native Swift 
implementation of Foundation (swift-corelibs-foundation) yet. I haven’t checked.

—Jens