rahul003 commented on a change in pull request #10183: [MXNET-120] Float16 support for distributed training
URL: https://github.com/apache/incubator-mxnet/pull/10183#discussion_r180322363
 
 

 ##########
 File path: amalgamation/Makefile
 ##########
 @@ -23,6 +23,11 @@ ifndef OPENBLAS_ROOT
     export OPENBLAS_ROOT=/usr/local/opt/openblas
 endif
 
+# use F16C if the architecture supports it, turned off by default
+ifndef USE_F16C
 
 Review comment:
   Added a better comment. F16C is an instruction set extension supported by newer x86 CPUs. It provides intrinsics for faster fp16 compute.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
