This is an automated email from the ASF dual-hosted git repository.

marcoabreu pushed a commit to branch v1.2.0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/v1.2.0 by this push:
     new 3da2939  Mark ONNX-MXNet experimental (#10677)
3da2939 is described below

commit 3da2939bd6fe9a0e76f6fb13e8930614d7993f3e
Author: Anirudh <[email protected]>
AuthorDate: Wed Apr 25 11:35:06 2018 -0700

    Mark ONNX-MXNet experimental (#10677)
    
    * Mark ONNX-MXNet experimental
    
    * change wording.
    
    * space nit
---
 NEWS.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/NEWS.md b/NEWS.md
index 2b984c2..ed680ac 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -5,7 +5,7 @@ MXNet Change Log
 - Implemented new [Scala Inference APIs](https://cwiki.apache.org/confluence/display/MXNET/MXNetScalaInferenceAPI) which offer easy-to-use, Scala-idiomatic, and thread-safe high-level APIs for performing predictions with deep learning models trained with MXNet (#9678). Implemented a new ImageClassifier class which provides APIs for classification tasks on a Java BufferedImage using a pre-trained model you provide (#10054). Implemented a new ObjectDetector class which provides APIs for [...]
 
 ### New Features - Added a Module to Import ONNX models into MXNet
-- Implemented a new ONNX module in MXNet which offers an easy-to-use API to import ONNX models into MXNet's symbolic interface (#9963). Check out the [example](https://github.com/apache/incubator-mxnet/blob/master/example/onnx/super_resolution.py) on how you could use this [API](https://cwiki.apache.org/confluence/display/MXNET/ONNX-MXNet+API+Design) to import ONNX models and perform inference with MXNet.
+- Implemented a new ONNX module in MXNet which offers an easy-to-use API to import ONNX models into MXNet's symbolic interface (#9963). Check out the [example](https://github.com/apache/incubator-mxnet/blob/master/example/onnx/super_resolution.py) on how you could use this [API](https://cwiki.apache.org/confluence/display/MXNET/ONNX-MXNet+API+Design) to import ONNX models and perform inference with MXNet. Currently, the ONNX-MXNet import module is still experimental; please use it with caution.
 
 ### New Features - Added Support for Model Quantization with Calibration
 - Implemented model quantization by adopting the [TensorFlow approach](https://www.tensorflow.org/performance/quantization) with calibration, borrowing the idea from Nvidia's [TensorRT](http://on-demand.gputechconf.com/gtc/2017/presentation/s7310-8-bit-inference-with-tensorrt.pdf). The focus of this work is on keeping the inference accuracy loss of quantized models (ConvNets for now) under control when compared to their corresponding FP32 models. Please see the [example](https://github.com/ap [...]

-- 
To stop receiving notification emails like this one, please contact
[email protected].
