#335: export floats and doubles with correct precision
-----------------------+----------------------------------------------------
 Reporter:  hamish     |       Owner:  grass-dev@…
     Type:  task       |      Status:  new
 Priority:  critical   |   Milestone:  6.4.4
Component:  Default    |     Version:  svn-develbranch6
 Keywords:  precision  |    Platform:  All
      Cpu:  All        |
-----------------------+----------------------------------------------------
Comment (by glynn):

Replying to [comment:26 hamish]:

> > No, it is not. %.17,9g is reproducible. See IEEE 754 standard.
>
> I didn't mean reproducible as far as the binary stored value round-trip
> was concerned, I meant reproducible as far as getting the same ascii
> result on two different hardware platforms.

All common platforms use IEEE-754 representation, so I wouldn't expect
differences due to hardware.

> My understanding, for what it is, is that the least sig. digits can
> flicker depending on the CPU arch, perhaps the compiler, and the
> programming language's implementation too.

Any differences are due to software, not hardware. The main reasons for
differences are:

1. Whether the software produces the closest decimal value to the actual
   binary value, or the shortest decimal value which would convert back to
   the actual binary value (those two aren't necessarily the same).

2. The rounding mode used in the event of a tie. E.g. 3.0/16 = 0.1875; if
   that is displayed with 3 fractional digits, should the result be 0.187
   (round toward zero, round toward negative infinity) or 0.188 (round
   toward positive infinity, round to the nearest even value)?

3. Bugs. Many programmers don't understand the details of floating-point,
   and don't particularly care whether the least-significant digits are
   "correct" (particularly as that would require deciding which flavour of
   "correct" you actually want).

--
Ticket URL: <http://trac.osgeo.org/grass/ticket/335#comment:27>
GRASS GIS <http://grass.osgeo.org>
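A minimal C sketch of the round-trip point discussed above, assuming IEEE-754
doubles and a C99 compiler (it is an illustration, not code from the ticket):
it shows why 17 significant digits survive a print-and-parse round trip while
15 may not, and how tie rounding decides the last displayed digit.

    /* Round-trip illustration; assumes IEEE-754 doubles and C99. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <float.h>

    int main(void)
    {
        double third = 1.0 / 3.0;   /* not exactly representable in binary */
        char buf[64];

        /* Printing only DBL_DIG (15) significant digits is fine for
           display, but scanning the result back is not guaranteed to
           recover the same bits. */
        snprintf(buf, sizeof buf, "%.*g", DBL_DIG, third);
        printf("%%.%dg  -> %-22s round trip %s\n", DBL_DIG, buf,
               strtod(buf, NULL) == third ? "exact" : "LOSSY");

        /* 17 significant digits (DBL_DECIMAL_DIG in C11) are enough for
           any finite double to survive print-and-parse unchanged. */
        snprintf(buf, sizeof buf, "%.17g", third);
        printf("%%.17g -> %-22s round trip %s\n", buf,
               strtod(buf, NULL) == third ? "exact" : "LOSSY");

        /* The tie case from point 2: 3.0/16 = 0.1875 shown with three
           fractional digits; the last digit depends on how the C library
           rounds ties (with the usual round-to-nearest-even this prints
           0.188). */
        printf("%%.3f of 3.0/16 -> %.3f\n", 3.0 / 16);
        return 0;
    }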