Mon, 11 Jul 2016 13:01:49 -0400, Jason Newton wrote:
> Does the ML have any ideas on how one could get a matmul that will not
> allow any funny business in the evaluation of the products?  Funny
> business here is something like changing the evaluation order of the
> additions of terms. I want strict IEEE 754 compliance - no 80-bit
> registers, perhaps control of the rounding mode, and no unsafe math
> optimizations.

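For context: IEEE 754 addition is not associative, so merely regrouping
the partial sums in a dot product can change the result. A minimal
illustration with plain Python floats (nothing Numpy-specific, values
chosen only to make the effect obvious):

    # Floating-point addition is not associative: the two groupings
    # below give different IEEE 754 double results, which is exactly
    # the kind of reordering a matmul implementation may do.
    a, b, c = 1e16, -1e16, 1.0

    print((a + b) + c)  # 1.0 -- cancellation first, the 1.0 survives
    print(a + (b + c))  # 0.0 -- the 1.0 is absorbed into -1e16 first
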
If you link Numpy with BLAS and LAPACK libraries that have been 
compiled for this purpose, and turn on the compiler flags that enforce 
strict IEEE behavior (and disable SSE) when compiling Numpy, you will 
probably get reproducible results. Numpy itself just offloads the dot 
computations to BLAS, so if your BLAS is reproducible, things should 
mostly be OK.
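You can verify which BLAS/LAPACK your Numpy build is actually linked 
against (the exact output depends on the Numpy version and how it was 
built):

    import numpy as np

    # Prints the BLAS/LAPACK libraries this Numpy build was linked
    # against, so you can check it is the strict-IEEE build you made.
    np.show_config()
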

You may also need to turn off the SSE optimizations in Numpy, because 
these can make results depend on memory alignment --- not in dot 
products, but in other computations.
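
If you want to see what alignment Numpy happened to give a particular 
array, a quick sketch (the 16-byte boundary is just an illustrative 
choice, not something Numpy guarantees or requires):

    import numpy as np

    a = np.ones(1000)
    # Offset of the data buffer from a 16-byte boundary; SIMD code
    # paths may choose different loop setups depending on this value.
    print(a.ctypes.data % 16)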

Out of curiosity, what is the application where this is necessary?
Maybe there is a numerically stable formulation?

-- 
Pauli Virtanen
