apeforest commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r293482529
 
 

 ##########
 File path: docs/api/python/autograd/autograd.md
 ##########
 @@ -76,7 +82,63 @@ Detailed tutorials are available in Part 1 of
 [the MXNet gluon book](http://gluon.mxnet.io/).
 
 
+# Higher order gradient
+
+Some operators support higher order gradients, meaning that you can calculate the gradient of the
+gradient. For this, the operator's backward must be differentiable as well. Some operators support
+differentiating multiple times, others only twice, and most just once.
+
+To calculate higher order gradients, we can either use the `mx.autograd.grad` function while
+recording and then call backward, or call `mx.autograd.grad` two times. If we do the latter, it is
+important that the first call uses `create_graph=True` and `retain_graph=True` and the second call
+uses `create_graph=False` and `retain_graph=True`. Otherwise we will not get the results that we
+want. If we were to recreate the graph in the second call, we would end up with a graph of just the
+backward nodes, not the full initial graph that includes the forward nodes.
+
+The pattern for calculating higher order gradients is the following:
+
+```python
+from mxnet import ndarray as nd
+from mxnet import autograd as ag
+x=nd.array([1,2,3])
 
 Review comment:
   Add space between "="
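
For context, the diff excerpt above is truncated at the commented line, so the full example does not appear here. A minimal sketch of the "grad while recording, then backward" pattern described in the hunk could look like the following; the choice of `nd.sin`, the variable names, and the expected result are illustrative assumptions (and assume the operator used supports second order gradients), not the exact code from the PR:

```python
from mxnet import ndarray as nd
from mxnet import autograd as ag

x = nd.array([1, 2, 3])
x.attach_grad()

with ag.record():
    y = nd.sin(x)
    # First derivative: create_graph/retain_graph keep the backward graph
    # so that it can itself be differentiated.
    x_grad = ag.grad(y, x, create_graph=True, retain_graph=True)[0]

# Second derivative: backpropagate through the first-derivative graph;
# the result is written to x.grad.
x_grad.backward()
print(x.grad)  # d^2/dx^2 sin(x) = -sin(x)
```

The alternative mentioned in the text would replace the final `backward()` call with a second `mx.autograd.grad` call using `create_graph=False` and `retain_graph=True`.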

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
