bartekkuncer commented on a change in pull request #20813:
URL: https://github.com/apache/incubator-mxnet/pull/20813#discussion_r787586914



##########
File path: 
docs/python_docs/python/tutorials/getting-started/gluon_migration_guide.md
##########
@@ -432,6 +432,67 @@ A new module called `mxnet.gluon.probability` has been 
introduced in Gluon 2.0.
 
 3. 
[Transformation](https://github.com/apache/incubator-mxnet/tree/master/python/mxnet/gluon/probability/transformation):
 implement invertible transformation with computable log det jacobians.
 
+##  oneDNN Integration
+### Operator Fusion
+In versions 1.x of MXNet pattern fusion in execution graph was enabled by 
default when using MXNet built with oneDNN library support and could have been 
disabled by setting 'MXNET_SUBGRAPH_BACKEND' environment flag to `None`. MXNet 
2.0 introduced changes in forward inference flow which led to refactor of 
fusion mechanism. To fuse model in MXNet 2.0 there are two requirements:
+
+ - the model must be defined as a subclass of HybridBlock or Symbol
+
+ - the model must have specific operator patterns which can be fused

Review comment:
```suggestion
 - the model must have specific operator patterns which can be fused.
```

##########
File path: 
docs/python_docs/python/tutorials/getting-started/gluon_migration_guide.md
##########
@@ -432,6 +432,67 @@ A new module called `mxnet.gluon.probability` has been 
introduced in Gluon 2.0.
 
 3. 
[Transformation](https://github.com/apache/incubator-mxnet/tree/master/python/mxnet/gluon/probability/transformation):
 implement invertible transformation with computable log det jacobians.
 
+##  oneDNN Integration
+### Operator Fusion
+In versions 1.x of MXNet pattern fusion in execution graph was enabled by 
default when using MXNet built with oneDNN library support and could have been 
disabled by setting 'MXNET_SUBGRAPH_BACKEND' environment flag to `None`. MXNet 
2.0 introduced changes in forward inference flow which led to refactor of 
fusion mechanism. To fuse model in MXNet 2.0 there are two requirements:
+
+ - the model must be defined as a subclass of HybridBlock or Symbol
+
+ - the model must have specific operator patterns which can be fused
+
+Both HybridBlock and Symbol classes provide API to easily run fusion of 
operators. All we have to do is to add single line of code running fusion 
passes on our model:

Review comment:
       This is the only sentence in the section written in the first person; consider rephrasing it impersonally.

##########
File path: python/mxnet/contrib/quantization.py
##########
@@ -174,15 +174,16 @@ def __init__(self):
     def collect(self, name, op_name, arr):
         """Function which is registered to Block as monitor callback. Names of 
layers
         requiring calibration are stored in `self.include_layers` variable.
-            Parameters
-            ----------
-            name : str
-                Node name from which collected data comes from
-            op_name : str
-                Operator name from which collected data comes from. Single 
operator
-                can have multiple inputs/ouputs nodes - each should have 
different name
-            arr : NDArray
-                NDArray containing data of monitored node
+
+        Parameters
+        ----------
+        name : str
+            Node name from which collected data comes from
+        op_name : str
+            Operator name from which collected data comes from. Single operator
+            can have multiple inputs/ouputs nodes - each should have different 
name

Review comment:
```suggestion
            can have multiple input/output nodes - each should have a different name.
```
   

##########
File path: python/mxnet/contrib/quantization.py
##########
@@ -174,15 +174,16 @@ def __init__(self):
     def collect(self, name, op_name, arr):
         """Function which is registered to Block as monitor callback. Names of 
layers
         requiring calibration are stored in `self.include_layers` variable.
-            Parameters
-            ----------
-            name : str
-                Node name from which collected data comes from
-            op_name : str
-                Operator name from which collected data comes from. Single 
operator
-                can have multiple inputs/ouputs nodes - each should have 
different name
-            arr : NDArray
-                NDArray containing data of monitored node
+
+        Parameters
+        ----------
+        name : str
+            Node name from which collected data comes from

Review comment:
```suggestion
            Node name from which collected data comes from.
```

##########
File path: python/mxnet/contrib/quantization.py
##########
@@ -174,15 +174,16 @@ def __init__(self):
     def collect(self, name, op_name, arr):
         """Function which is registered to Block as monitor callback. Names of 
layers
         requiring calibration are stored in `self.include_layers` variable.
-            Parameters
-            ----------
-            name : str
-                Node name from which collected data comes from
-            op_name : str
-                Operator name from which collected data comes from. Single 
operator
-                can have multiple inputs/ouputs nodes - each should have 
different name
-            arr : NDArray
-                NDArray containing data of monitored node
+
+        Parameters
+        ----------
+        name : str
+            Node name from which collected data comes from
+        op_name : str
+            Operator name from which collected data comes from. Single operator
+            can have multiple inputs/ouputs nodes - each should have different 
name
+        arr : NDArray
+            NDArray containing data of monitored node

Review comment:
```suggestion
            NDArray containing data of monitored node.
```
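As an aside, the callback contract documented in the hunk above (`collect(name, op_name, arr)` registered as a Block monitor, filtered by `include_layers`) can be illustrated with a small self-contained sketch. The class name `MinMaxCollector` and the min/max statistics are hypothetical, not MXNet's actual implementation, and plain lists stand in for NDArrays:

```python
class MinMaxCollector:
    """Stores per-node min/max of observed data for calibration."""

    def __init__(self, include_layers=None):
        # None means collect statistics for every node.
        self.include_layers = include_layers
        self.stats = {}

    def collect(self, name, op_name, arr):
        # Skip nodes not selected for calibration.
        if self.include_layers is not None and name not in self.include_layers:
            return
        lo, hi = float(min(arr)), float(max(arr))
        # Merge with previously observed range for the same node.
        if name in self.stats:
            old_lo, old_hi = self.stats[name]
            lo, hi = min(lo, old_lo), max(hi, old_hi)
        self.stats[name] = (lo, hi)


collector = MinMaxCollector()
collector.collect('conv0_output', 'Convolution', [0.5, -1.0, 2.0])
collector.collect('conv0_output', 'Convolution', [3.0, 0.0])
```

A real collector would receive an NDArray as `arr` and typically copy it off the compute device before reducing it.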

##########
File path: 
docs/python_docs/python/tutorials/getting-started/gluon_migration_guide.md
##########
@@ -432,6 +432,67 @@ A new module called `mxnet.gluon.probability` has been 
introduced in Gluon 2.0.
 
 3. 
[Transformation](https://github.com/apache/incubator-mxnet/tree/master/python/mxnet/gluon/probability/transformation):
 implement invertible transformation with computable log det jacobians.
 
+##  oneDNN Integration
+### Operator Fusion
+In versions 1.x of MXNet pattern fusion in execution graph was enabled by 
default when using MXNet built with oneDNN library support and could have been 
disabled by setting 'MXNET_SUBGRAPH_BACKEND' environment flag to `None`. MXNet 
2.0 introduced changes in forward inference flow which led to refactor of 
fusion mechanism. To fuse model in MXNet 2.0 there are two requirements:
+
+ - the model must be defined as a subclass of HybridBlock or Symbol

Review comment:
```suggestion
 - the model must be defined as a subclass of HybridBlock or Symbol,
```




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

