Please forgive me if this has already been hashed out.  Google and 
DuckDuckGo have not been helpful in my research, and the archives do not have 
search functionality.

        I am new to Swift, starting from the 3.0 beta.  I am aware that the 
language has not yet stabilized, so I feel this is an opportunity to introduce a 
native data type.

        Historically, data types typically follow what the CPU supports, hence 
the int/float data types.  A char/String is technically an integer/array of 
integers under the hood (and Unicode-supporting Strings are a bit more 
complicated than that).  Among the customers I work with, data accuracy is very 
important.  So when the only option for a value with a fractional portion does 
not guarantee data accuracy, such as:

return 3.0 * sideLength    // comes back as 9.300000000001 instead of the expected 9.3

        It gives us pause, as floating-point numbers are always understood to 
be inexact, and the whole song and dance required to truncate or round the 
number to ensure accurate results makes our financial customers very nervous.  
They want accuracy from the start, and whenever they audit the results in the 
middle of the process, they expect to see 9.3 on the dot.
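
        To make the problem concrete, here is a minimal sketch of the inaccuracy 
and of the rounding workaround; the 0.1/0.2 values are an illustration of my 
own, not taken from the computation above:

// Binary floating point cannot represent most decimal fractions exactly,
// so an audit that expects an exact value fails the comparison:
let total: Double = 0.1 + 0.2
print(total == 0.3)            // false: total is actually 0.30000000000000004...

// The usual workaround is to round before comparing or reporting --
// exactly the song and dance described above:
let roundedTotal = (total * 100).rounded() / 100
print(roundedTotal == 0.3)     // true, but only after the extra rounding step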

        Having a native Decimal data type would give us a consistent way to 
handle pulling data from and putting data into databases.  Having worked with 
APT_Decimal within the Orchestrate/PX Engine, I know we could do a better job 
if we could leverage the speed of float while still preserving the accuracy of 
the actual data, in spite of the CPU's quirks in handling floating-point 
numbers in a maddeningly inaccurate manner.
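
        For comparison, here is a rough sketch of the base-10 behavior we are 
after.  I am using Foundation's NSDecimalNumber purely as an illustration of 
the semantics; it is a Foundation class rather than the native, CPU-speed 
language type this post is asking about:

import Foundation

// Values enter as strings (as they typically do from a database column)
// and stay in base 10 throughout, so 0.1 + 0.2 really is 0.3:
let a = NSDecimalNumber(string: "0.1")
let b = NSDecimalNumber(string: "0.2")
let sum = a.adding(b)
print(sum.compare(NSDecimalNumber(string: "0.3")) == .orderedSame)   // true

// Round-tripping back to a string for storage preserves the exact digits:
print(sum.stringValue)                                               // "0.3"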

        Is there any particular reason why we do not have a native Decimal data 
type for Swift?  

-T.J.

