fhieber commented on a change in pull request #7395: Drafted documentation for autograd.
URL: https://github.com/apache/incubator-mxnet/pull/7395#discussion_r132285955
 
 

 ##########
 File path: docs/api/python/autograd.md
 ##########
 @@ -9,6 +9,64 @@
 .. warning:: This package is currently experimental and may change in the near future.
 ```
 
+## Overview
+
+The `autograd` package enables automatic
+differentiation of NDArray operations.
+In machine learning applications,
+`autograd` is often used to calculate the gradients
+of loss functions with respect to parameters.
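+
+For example, the gradient of a squared-error loss with respect to a weight `w`
+can be computed as follows (a minimal sketch; the names `w`, `x`, and `t` are
+illustrative, and `attach_grad`, `record`, and `backward` are explained below):
+
+```python
+>>> w = mx.nd.array([1.0])
+>>> w.attach_grad()                   # allocate space for d(loss)/dw
+>>> x, t = mx.nd.array([2.0]), mx.nd.array([5.0])
+>>> with mx.autograd.record():        # record the forward computation
+...     loss = ((w * x - t) ** 2).sum()
+>>> loss.backward()                   # populate w.grad
+>>> print(w.grad)                     # 2 * (w*x - t) * x = 2 * (2 - 5) * 2
+[-12.]
+<NDArray 1 @cpu(0)>
+```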
+
+
+### Record vs Pause
+
+`autograd` records computation history on the fly in order to calculate gradients later.
+This is only enabled inside a `with autograd.record():` block.
+A `with autograd.pause():` block can be used inside a `record()` block
+to temporarily disable recording.
+
+To compute the gradient with respect to an `NDArray` `x`, first call `x.attach_grad()`
+to allocate space for the gradient. Then, start a `with autograd.record()` block,
+do some computation, and finally call `backward()` on the result:
+
+```python
+>>> x = mx.nd.array([1,2,3,4])
+>>> x.attach_grad()
+>>> with mx.autograd.record():
+...     y = x * x + 1
+>>> y.backward()
+>>> print(x.grad)
+[ 2.  4.  6.  8.]
+<NDArray 4 @cpu(0)>
+```
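+
+Recording can also be paused temporarily inside a `record()` block, for example
+to do bookkeeping that should not be differentiated (a minimal sketch; `v` is an
+illustrative intermediate value):
+
+```python
+>>> x = mx.nd.array([1,2,3,4])
+>>> x.attach_grad()
+>>> with mx.autograd.record():
+...     y = x * x + 1                 # recorded
+...     with mx.autograd.pause():
+...         v = (y + 1).asnumpy()     # not recorded; does not affect gradients
+>>> y.backward()
+>>> print(x.grad)                     # unchanged: dy/dx = 2 * x
+[ 2.  4.  6.  8.]
+<NDArray 4 @cpu(0)>
+```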
+
+
+## Train Mode and Predict Mode
+
+Some operators (Dropout, BatchNorm, etc.) behave differently
+when training and when making predictions.
+This can be controled with `train_mode` and `predict_mode` scope.
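+
+For example (a minimal sketch; the dropout probability `p=0.5` is illustrative):
+
+```python
+>>> x = mx.nd.ones((2, 3))
+>>> with mx.autograd.train_mode():
+...     y = mx.nd.Dropout(x, p=0.5)   # training behavior: units randomly zeroed
+>>> with mx.autograd.predict_mode():
+...     z = mx.nd.Dropout(x, p=0.5)   # prediction behavior: identity, z equals x
+```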
 
 Review comment:
   typo: controled -> controlled
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
