reminisce commented on issue #9552: [REQUEST FOR REVIEW | DO NOT MERGE] Model Quantization with Calibration
   @pengzhao-intel I will definitely let you know if there are breaking changes.
   For testing inference, you can use the script in
`example/quantization/` to generate the quantized models
(resnet-152 and inception w/ bn) and run inference using
`example/quantization/`. Remember to change the `ctx` to
`mx.cpu()`, since it currently defaults to `mx.gpu(0)`.
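The `ctx` change described above amounts to a one-line edit in the inference script. A hedged sketch of the before/after (the exact variable name and surrounding code in the example script may differ):

```python
# Default in the example script (GPU inference):
ctx = mx.gpu(0)

# Change to this for CPU inference, as noted above:
ctx = mx.cpu()
```

Any module or executor bound with this `ctx` will then run the quantized model on the CPU.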
