OneRaynyDay opened a new pull request #11833: [MXNET-688] Fix quantization 
divide by zero errors
URL: https://github.com/apache/incubator-mxnet/pull/11833
 
 
   ## Description ##
   The current quantization strategy for `calib_mode='entropy'` is to calculate the KL divergence for a range of candidate thresholds and choose the best one. This assumes the underlying random variable is continuous, with nonzero density over all reals. Because we are discretizing the distribution, we smooth it over the range `[-threshold, threshold]`. What is not considered is that the entire sampled distribution may lie outside `[-threshold, threshold]`, in which case the sampled candidate `p` distribution inside `_get_optimal_threshold` is all zeros.
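   For example (a minimal sketch with hypothetical values, not the actual `_get_optimal_threshold` code), an activation distribution whose mass sits entirely outside a small candidate threshold yields an all-zero histogram:

```python
import numpy as np

# Hypothetical activations from a regressor's last layer: all of the
# mass is far from zero (illustrative values only).
activations = np.random.normal(loc=100.0, scale=1.0, size=10000)

threshold = 1.0  # a small candidate threshold under consideration
# Bin the samples over [-threshold, threshold]; every sample falls
# outside the range, so every bin count is zero.
p, _ = np.histogram(activations, bins=255, range=(-threshold, threshold))
print(p.sum())  # 0 -> normalizing p downstream divides by zero
```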
   
   I have added a check that the distribution (possibly unnormalized) actually has mass, i.e., is not all zeros, before attempting to smooth it; otherwise we run into a divide-by-zero error.
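   Conceptually, the guard behaves like the following sketch (my reading of the change; `_safe_entropy` is a hypothetical name, not the function actually used in quantization.py):

```python
import numpy as np

def _safe_entropy(p, q):
    """Hypothetical KL(p || q) helper that treats an empty candidate
    distribution as infinitely bad instead of dividing by zero.
    Assumes q has been smoothed so it is nonzero wherever p is."""
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    if p.sum() == 0 or q.sum() == 0:
        # No mass inside [-threshold, threshold]: this threshold can
        # never be optimal, so skip smoothing and report infinity.
        return np.inf
    p = p / p.sum()  # this normalization was the divide-by-zero site
    q = q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
```

   A candidate whose distribution is all zeros then simply loses the arg-min over thresholds rather than crashing the calibration.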
   
   In most cases, activation functions and layers for classification-type problems output values symmetric around 0. This is not the case for a regressor's last layer, and there are various other examples where the activation distribution is not centered around 0; this was a major blocker for Airbnb's adoption of MXNet's quantization capabilities.
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [ ] The PR title starts with [MXNET-$JIRA_ID], where $JIRA_ID refers to 
the relevant [JIRA issue](https://issues.apache.org/jira/projects/MXNET/issues) 
created (except PRs with tiny changes)
   - [ ] Changes are complete (i.e. I finished coding on this PR)
   - [ ] All changes have test coverage:
   - Unit tests are added for small changes to verify correctness (e.g. adding 
a new operator)
   - Nightly tests are added for complicated/long-running ones (e.g. changing 
distributed kvstore)
   - Build tests will be added for build configuration changes (e.g. adding a 
new build option with NCCL)
   - [ ] Code is well-documented: 
   - For user-facing API changes, API doc string has been updated. 
   - For new C++ functions in header files, their functionalities and arguments 
are documented. 
   - For new examples, README.md is added to explain what the example does, the source of the dataset, expected performance on the test set, and a reference to the original paper if applicable
   - Check the API doc at 
http://mxnet-ci-doc.s3-accelerate.dualstack.amazonaws.com/PR-$PR_ID/$BUILD_ID/index.html
   - [ ] To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change
   
   ### Changes ###
   - Added tests, and extra code inside quantization.py that sets the KL divergence to infinity if the probability distribution formed is all zeros. Some typos and off-by-one errors in the original implementation are also fixed.
   
