aaronmarkham commented on a change in pull request #13160: Fix Sphinx python
docstrings
URL: https://github.com/apache/incubator-mxnet/pull/13160#discussion_r231690878
##########
File path: python/mxnet/optimizer/optimizer.py
##########
@@ -692,20 +692,19 @@ class LBSGD(Optimizer):
Parameters
----------
momentum : float, optional
- The momentum value.
+ The momentum value.
multi_precision: bool, optional
-     Flag to control the internal precision of the optimizer.
-     ``False`` results in using the same precision as the weights (default),
-     ``True`` makes internal 32-bit copy of the weights and applies gradients
-     in 32-bit precision even if actual weights used in the model have lower precision.
-     Turning this on can improve convergence and accuracy when training with float16.
+ Flag to control the internal precision of the optimizer.
+ ``False`` results in using the same precision as the weights (default),
+ ``True`` makes internal 32-bit copy of the weights and applies gradients
Review comment:
I'd like to see the rendered output here, and whether there's a directive that
can be used to make the formatting come out as expected.
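For context, here is a minimal sketch (not the actual MXNet source; the stub function and its name are made up for illustration) of the NumPy-style parameter block that Sphinx's napoleon extension parses. The key point the diff is making: every description line under a parameter name must share one consistent indent level, or Sphinx renders the block incorrectly.

```python
import inspect

def lbsgd_stub(momentum=0.0, multi_precision=False):
    """Hypothetical stub illustrating the docstring layout under discussion.

    Parameters
    ----------
    momentum : float, optional
        The momentum value.
    multi_precision : bool, optional
        Flag to control the internal precision of the optimizer.
        ``False`` results in using the same precision as the weights (default),
        ``True`` makes internal 32-bit copy of the weights and applies
        gradients in 32-bit precision even if actual weights used in the
        model have lower precision. Turning this on can improve convergence
        and accuracy when training with float16.
    """
    return momentum, multi_precision

# Sanity check: after inspect.getdoc() dedents the docstring, every
# description line should sit at the same single indent level, so Sphinx
# renders the block as a definition list rather than a broken quote.
doc = inspect.getdoc(lbsgd_stub)
desc_lines = [l for l in doc.splitlines() if l.startswith("    ")]
indents = {len(l) - len(l.lstrip()) for l in desc_lines}
print(sorted(indents))
```

Running `python -m sphinx` (with `sphinx.ext.napoleon` enabled) over a module laid out like this is the quickest way to see whether the indentation renders as expected.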
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services