cjolivier01 closed pull request #9779: add missed optimizer docs
URL: https://github.com/apache/incubator-mxnet/pull/9779
 
 
   

This is a pull request merged from a forked repository.
As GitHub hides the original diff on merge, it is reproduced below
for the sake of provenance:

diff --git a/docs/api/python/optimization/optimization.md b/docs/api/python/optimization/optimization.md
index e333b0076e..7d6276d4bd 100644
--- a/docs/api/python/optimization/optimization.md
+++ b/docs/api/python/optimization/optimization.md
@@ -96,8 +96,14 @@ implements one weight updating function.
     Adam
     AdaGrad
     AdaDelta
+    Adamax
+    Nadam
     DCASGD
     SGLD
+    Signum
+    FTML
+    LBSGD
+    Ftrl
 ```
 
 ## The ``mxnet.lr_scheduler`` package
diff --git a/python/mxnet/optimizer.py b/python/mxnet/optimizer.py
index 06527723a2..065c08cee4 100644
--- a/python/mxnet/optimizer.py
+++ b/python/mxnet/optimizer.py
@@ -68,8 +68,8 @@ class Optimizer(object):
        Flag to control the internal precision of the optimizer.
        ``False`` results in using the same precision as the weights (default),
        ``True`` makes internal 32-bit copy of the weights and applies gradients
-                in 32-bit precision even if actual weights used in the model have lower precision.
-                Turning this on can improve convergence and accuracy when training with float16.
+       in 32-bit precision even if actual weights used in the model have lower precision.
+       Turning this on can improve convergence and accuracy when training with float16.
 
     Properties
     ----------
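
For context (not part of this PR's diff): the optimizers added to the
table above are all registered in MXNet's optimizer registry under their
lower-cased class names, and ``multi_precision`` (whose docstring the
second hunk re-indents) makes the optimizer keep a 32-bit master copy of
the weights. A minimal sketch, assuming an MXNet build of this era with
mxnet.optimizer.create and the listed optimizer classes:

    import mxnet as mx

    # Each optimizer newly listed in optimization.md can be constructed
    # by its registered (lower-cased) name through the registry helper.
    for name in ('adamax', 'nadam', 'signum', 'ftml', 'lbsgd', 'ftrl'):
        opt = mx.optimizer.create(name, learning_rate=0.01)
        print(name, '->', type(opt).__name__)

    # multi_precision keeps an internal float32 copy of the weights and
    # applies updates in float32, which can improve convergence and
    # accuracy when the model weights themselves are float16.
    sgd = mx.optimizer.SGD(learning_rate=0.1, multi_precision=True)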


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
