samskalicky commented on issue #12091: [MXNET-792] Fix for issue #9816 with 
dropout operator and RNG
URL: https://github.com/apache/incubator-mxnet/pull/12091#issuecomment-413056672
 
 
   @eric-haibin-lin I did some more investigation, and it seems that the CPU RNG 
produces values in (0,1] rather than [0,1) as initially thought. I was able to 
get a random value of exactly 1.0 using seed 976064129 for 
test_operator.py:test_dropout on the CPU. This means that if dropout=0, then 
pkeep=1 (the goal being to keep everything, i.e. no dropout), yet an element is 
still dropped when the RNG returns exactly 1.0 under less-than thresholding. 
Changing the code to use <= for the GPU and < for the CPU therefore doesn't 
work.
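   A minimal Python sketch of this edge case (the function name and structure 
are illustrative stand-ins, not MXNet's actual kernel code):

```python
# Illustrative stand-in for the dropout keep-decision under less-than
# thresholding. Names here are hypothetical, not MXNet identifiers.
def keep_lt(rand_val, pkeep):
    """Return True when the element is kept under `rand_val < pkeep`."""
    return rand_val < pkeep

# dropout = 0  =>  pkeep = 1.0: every element should be kept.
# A CPU RNG sampling from (0, 1] can return exactly 1.0, which is then
# dropped even though pkeep = 1.0 means no dropout at all.
print(keep_lt(0.5, 1.0))  # True: kept, as expected
print(keep_lt(1.0, 1.0))  # False: wrongly dropped
```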
   
   If dropout=1, then pkeep=0. This triggers the second term in the mask 
computation, (1.0f/pkeep). With pkeep=0 that division by zero produces a 
non-finite value (inf, which becomes NaN once multiplied into the zeroed 
mask), and the NaN propagates out to the output values.
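   A hedged Python sketch of that propagation (Python raises on float division 
by zero, so the IEEE-754 result of the C++ divide is written out explicitly):

```python
# Sketch of the pkeep = 0 failure mode. In C/C++ IEEE-754 float arithmetic,
# 1.0f / 0.0f evaluates to +inf, and 0.0f * inf is NaN, which then
# propagates to every output element.
import math

pkeep = 0.0               # dropout = 1  =>  pkeep = 0
scale = float('inf')      # what 1.0f / pkeep evaluates to in C++
mask = 0.0                # every element is dropped, so the mask is 0
output = mask * scale     # 0.0 * inf == NaN
print(math.isnan(output)) # True: NaN reaches the output values
```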
   
   The code currently checked in, which uses <= for both CPU and GPU, is valid 
and currently passes the test_dropout evaluations on both CPU and GPU.
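   A quick sketch of why the checked-in <= thresholding handles the pkeep=1 
end correctly (again an illustrative stand-in, not the actual kernel):

```python
# With `rand_val <= pkeep`, a draw of exactly 1.0 from a (0, 1] RNG is
# still kept when pkeep = 1.0, so dropout = 0 really keeps everything.
def keep_le(rand_val, pkeep):
    return rand_val <= pkeep

print(keep_le(1.0, 1.0))  # True: the (0, 1] endpoint is no longer dropped
print(keep_le(0.5, 1.0))  # True: interior draws are kept as before
```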
