On 11/11/2017 06:05 AM, Ari Jolma wrote:
I'm making a data request on the corner of the bounding box; let's say it has a minimum X of 75042.7273594. I'm setting my minX to that value and enforcing it with MAX (this was introduced because of a case with ArcGIS). Then I print it into the request with "%.18g" (it was "%.15g" earlier but I changed it because of ArcGIS) and the result is 75042.7273593999998, which is not good for Rasdaman, since it is formally less than 75042.7273594. Although, in gdb

(gdb) p 75042.7273594 > 75042.727359399999
$7 = false

Any ideas on how to detect/prevent these kinds of situations? Keep the checks and go back to "%.15g"? Not ok with ArcGIS.

All modern systems use IEEE 754 double-precision floating-point representation.  This has a mantissa size of 53 bits which translates to a nominal decimal precision of 15.95 digits, but the actual available precision wobbles between 15 and 17 decimal digits depending on the number being represented.  You will never be able to get 18 "true" digits of precision.

What I do in general to compensate for the kind of error you are running into is generate the value with "%.16g" and then check the 15th and 16th generated digits (if they are actually generated) for the pattern "99" or "01".  If either pattern is present, I re-generate the value with "%.15g".  This generally gives you an extra digit of precision without the problem of adding extraneous "00001" or "99999" patterns to numbers that started out being relatively round in decimal.

--
Dr. Craig S. Bruce
Senior Software Developer
CubeWerx Inc.
http://www.cubewerx.com
_______________________________________________
gdal-dev mailing list
[email protected]
https://lists.osgeo.org/mailman/listinfo/gdal-dev
