I'd like to know what people get when they compute

sage: maxima('asinh(2.0)')

For every machine I've tested it on,

* sage.math (Linux)
* bsd.math (OS X)
* t2.math (Solaris 10 on SPARC)
* fulvia @ skynet (Solaris 10 on x86)
* My Ultra 27 (OpenSolaris on x86)

they all give the same result

sage: maxima('asinh(2.0)')
1.44363547517881

which happens to be correct in all the digits shown (a higher-precision result is 1.44363547517881034249327674027).
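For anyone without a Sage install handy, the value can be cross-checked from the identity asinh(x) = ln(x + sqrt(x^2 + 1)) in plain Python. This is a sketch independent of Maxima; it goes through the platform's libm, so any floating-point differences between machines would be exactly the kind of thing it might expose:

```python
import math

# asinh(x) has the closed form ln(x + sqrt(x^2 + 1)),
# so the Maxima result can be cross-checked two ways.
x = 2.0
via_libm = math.asinh(x)                         # libm's asinh
via_log = math.log(x + math.sqrt(x * x + 1.0))   # closed form

print(via_libm)
print(via_log)
print(abs(via_libm - via_log))  # should be at most an ulp or two
```

Both routes should agree with 1.44363547517881 to all printed digits on any IEEE-754 double-precision machine, though the last bit could in principle differ between libm implementations.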

I'd be interested, however, if anyone gets anything different, which might happen with a different floating-point processor; so far, all the machines I've tested give the same result.

Dave
