ArmageddonKnight commented on a change in pull request #18228:
URL: https://github.com/apache/incubator-mxnet/pull/18228#discussion_r422371418



##########
File path: docs/static_site/src/pages/api/faq/env_var.md
##########
@@ -189,14 +189,13 @@ $env:MXNET_STORAGE_FALLBACK_LOG_VERBOSE=0
   - The maximum size of an NDArray slice in terms of number of parameters.
   - This parameter is used to slice an NDArray before synchronizing through 
P3Store (dist_p3).
 
-## Memonger
+## Memory Optimizations
 
-* MXNET_BACKWARD_DO_MIRROR
+* MXNET_MEMORY_OPT
   - Values: 0(false) or 1(true) ```(default=0)```
-  - MXNet uses mirroring concept to save memory. Normally backward pass needs 
some forward input and it is stored in memory but you can choose to release 
this saved input and recalculate it in backward pass when needed. This 
basically trades off the computation for memory consumption.
-  - This parameter decides whether to do `mirror` during training for saving 
device memory.
-  - When set to `1`, during forward propagation, graph executor will `mirror` 
some layer's feature map and drop others, but it will re-compute this dropped 
feature maps when needed.
-  - `MXNET_BACKWARD_DO_MIRROR=1` will save 30%~50% of device memory, but 
retains about 95% of running speed.
+  - When set to `1`, MXNet will adopt various approaches to reduce the memory 
consumption of the model. For example, it uses the mirroring concept to save 
memory: normally the backward pass needs some forward inputs to compute the 
gradients. Those inputs have to be stashed in memory and persist throughout 
the training process. However, you can choose to release those saved inputs 
and recalculate them in the backward pass when needed, which trades extra 
computation for lower memory consumption. With this flag enabled, during 
forward propagation the graph executor will `mirror` some layers' feature 
maps and drop others, re-computing the dropped feature maps when they are 
needed.
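A minimal sketch of how the flag described above would be used. Note that, like other MXNet environment variables on this page, it must be set before the library is imported, since the backend reads environment variables at startup; the `import mxnet` line is commented out here and the surrounding training code is hypothetical.

```python
import os

# Enable the memory optimizations (mirroring / recomputation of feature
# maps) described in the documentation diff. Must be set before MXNet
# is imported, because the backend reads this variable at startup.
os.environ["MXNET_MEMORY_OPT"] = "1"

# import mxnet as mx   # MXNet would now pick up MXNET_MEMORY_OPT=1
# ... build the symbol / Gluon model and train as usual ...

print(os.environ["MXNET_MEMORY_OPT"])
```

Alternatively, the variable can be exported in the shell (`export MXNET_MEMORY_OPT=1`) before launching the training script, which avoids ordering concerns inside the script itself.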

Review comment:
       Hi @apeforest, I believe your feedback has been addressed. Please 
let me know if you have any further concerns. Thank you.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
