bgawrych commented on a change in pull request #20813:
URL: https://github.com/apache/incubator-mxnet/pull/20813#discussion_r793737572



##########
File path: 
docs/python_docs/python/tutorials/getting-started/gluon_migration_guide.md
##########
@@ -432,6 +432,67 @@ A new module called `mxnet.gluon.probability` has been 
introduced in Gluon 2.0.
 
 3. 
[Transformation](https://github.com/apache/incubator-mxnet/tree/master/python/mxnet/gluon/probability/transformation):
 implements invertible transformations with computable log-det Jacobians.
 
+## oneDNN Integration
+### Operator Fusion
+In MXNet 1.x, pattern fusion in the execution graph was enabled by default when MXNet was built with oneDNN library support, and could be disabled by setting the `MXNET_SUBGRAPH_BACKEND` environment variable to `None`. MXNet 2.0 introduced changes in the forward inference flow which led to a refactor of the fusion mechanism. To fuse a model in MXNet 2.0 there are two requirements:
+
+ - the model must be defined as a subclass of HybridBlock or Symbol
+
+ - the model must have specific operator patterns which can be fused
+
+Both the HybridBlock and Symbol classes provide an API to easily run operator fusion. All we have to do is add a single line of code running fusion passes on our model:

Review comment:
       ```suggestion
  Both HybridBlock and Symbol classes provide an API to easily run fusion of operators. Adding only one line of code is needed to run fusion passes on the model:
   ```




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
