Zha0q1 edited a comment on issue #18926:
URL: https://github.com/apache/incubator-mxnet/issues/18926#issuecomment-676856554


   > Thanks Zhaoqi. If the issue turns out to be the DLAMI OpenBLAS, we should be 
able to create an issue with the AWS DLAMI team and get it resolved by bringing 
it to their notice.
   
   Will do by this week.
   
   > Outside DLAMI, what is your summary? Are you saying there will be no loss 
of precision? Your GitHub issue with the OpenBLAS team says it is expected to 
have different behavior?
   
   The summary is that, given the same version and build configuration, the 
32-bit and 64-bit OpenBLAS builds give consistent results. This is what we and 
the OpenBLAS team expect.
   
   Both my 32-bit and 64-bit OpenBLAS builds show the same loss of precision. 
The OpenBLAS team explains that this is caused by the nature of floating-point 
computation and is not a bug. However, it appears that the DLAMI ships with a 
special OpenBLAS binary that does not suffer from the same loss of precision.
   
   In other words, migrating from 32-bit to 64-bit OpenBLAS with all other 
settings and versions kept identical should not break anything. But it is 
possible that people are already getting different results if 1. they use 
OpenBLAS binaries built with different optimization levels, and 2. they happen 
to have (large) computations that accumulate numerical error.
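   As a hypothetical illustration of the point above (not code from the issue): 
floating-point addition is not associative, so two BLAS kernels that accumulate 
the same values in a different order, e.g. a vectorized loop in one optimized 
build versus a scalar loop in another, can legitimately return slightly 
different results with no bug in either.

   ```python
   import numpy as np

   # Float addition is not associative: the same three float32 values
   # summed in two different orders give two different answers.
   x = np.float32(1e8)
   y = np.float32(-1e8)
   z = np.float32(0.5)

   left = (x + y) + z    # large terms cancel first, then 0.5 is added -> 0.5
   right = x + (y + z)   # 0.5 is absorbed by -1e8 (below one ULP) -> 0.0
   print(left, right)    # prints: 0.5 0.0
   ```

   This is the same mechanism by which differently-optimized OpenBLAS binaries 
can accumulate error differently on large computations.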


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
