This is an automated email from the ASF dual-hosted git repository.

haibin pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
     new d056bfd  Fixed 4 broken links (#9698)
d056bfd is described below

commit d056bfd2d4c3947f7a04a91be205f761cea0f362
Author: thinksanky <31976455+thinksa...@users.noreply.github.com>
AuthorDate: Tue Feb 6 09:17:06 2018 -0800

    Fixed 4 broken links (#9698)
    
    * Fixed 4 broken links
    
    * Fixed pylint by disabling line-too-long, and fixed 1 broken link
---
 docs/faq/finetune.md          | 2 +-
 docs/faq/multi_devices.md     | 2 +-
 docs/tutorials/index.md       | 4 ++--
 python/mxnet/gluon/trainer.py | 4 ++--
 4 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/docs/faq/finetune.md b/docs/faq/finetune.md
index 2c6c7e3..533c3ca 100644
--- a/docs/faq/finetune.md
+++ b/docs/faq/finetune.md
@@ -15,7 +15,7 @@ with these pretrained weights when training on our new task. This process is
 commonly called _fine-tuning_. There are a number of variations of fine-tuning.
 Sometimes, the initial neural network is used only as a _feature extractor_.
 That means that we freeze every layer prior to the output layer and simply learn
-a new output layer. In [another document](https://github.com/dmlc/mxnet-notebooks/blob/master/python/faq/predict.ipynb), we explained how to
+a new output layer. In [another document](https://github.com/dmlc/mxnet-notebooks/blob/master/python/how_to/predict.ipynb), we explained how to
 do this kind of feature extraction. Another approach is to update all of
 the network's weights for the new task, and that's the approach we demonstrate in
 this document.
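
(A minimal sketch of the feature-extractor variant described in the hunk above, assuming the Gluon model zoo API; the model choice and NUM_CLASSES are illustrative placeholders, not part of this patch.)

from mxnet import gluon, init

NUM_CLASSES = 10  # placeholder: class count for the new task

# Load a network pretrained on ImageNet from the Gluon model zoo.
net = gluon.model_zoo.vision.resnet18_v2(pretrained=True)

# Freeze every layer prior to the output layer: no gradients are
# computed for the pretrained feature extractor.
for param in net.features.collect_params().values():
    param.grad_req = 'null'

# Replace the output layer with a freshly initialized one and learn
# only its weights on the new task.
net.output = gluon.nn.Dense(NUM_CLASSES)
net.output.initialize(init.Xavier())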
diff --git a/docs/faq/multi_devices.md b/docs/faq/multi_devices.md
index 5d538bc..b9cb3ea 100644
--- a/docs/faq/multi_devices.md
+++ b/docs/faq/multi_devices.md
@@ -210,4 +210,4 @@ export PS_VERBOSE=1; python ../../tools/launch.py ...
 ### More
 
 - See more launch options by `python ../../tools/launch.py -h`
-- See more options of [ps-lite](http://ps-lite.readthedocs.org/en/latest/faq.html)
+- See more options of [ps-lite](https://ps-lite.readthedocs.io/en/latest)
diff --git a/docs/tutorials/index.md b/docs/tutorials/index.md
index aca091c..3eff299 100644
--- a/docs/tutorials/index.md
+++ b/docs/tutorials/index.md
@@ -134,7 +134,7 @@ The Gluon and Module tutorials are in Python, but you can also find a variety of
 
 - [Imperative tensor operations on CPU/GPU](http://mxnet.incubator.apache.org/tutorials/basic/ndarray.html)
 
-- [NDArray Indexing](http://mxnet.incubator.apache.org/tutorials/basic/ndarray_indexing.html)
+- [NDArray Indexing](../tutorials/basic/ndarray_indexing.html)
 
 - [Symbol API](http://mxnet.incubator.apache.org/tutorials/basic/symbol.html)
 
@@ -174,7 +174,7 @@ The Gluon and Module tutorials are in Python, but you can also find a variety of
 
 <div class="applications">
 
-- [Connectionist Temporal Classification](http://mxnet.incubator.apache.org/tutorials/speech_recognition/ctc.html)
+- [Connectionist Temporal Classification](../tutorials/speech_recognition/ctc.html)
 
 - [Distributed key-value store](http://mxnet.incubator.apache.org/tutorials/python/kvstore.html)
 
diff --git a/python/mxnet/gluon/trainer.py b/python/mxnet/gluon/trainer.py
index 71c144f..c8822bb 100644
--- a/python/mxnet/gluon/trainer.py
+++ b/python/mxnet/gluon/trainer.py
@@ -16,7 +16,7 @@
 # under the License.
 
 # coding: utf-8
-# pylint: disable=
+# pylint: disable=line-too-long
 """Parameter optimizer."""
 __all__ = ['Trainer']
 
@@ -34,7 +34,7 @@ class Trainer(object):
         The set of parameters to optimize.
     optimizer : str or Optimizer
         The optimizer to use. See
-        `help <http://mxnet.io/api/python/optimization.html#the-mxnet-optimizer-package>`_
+        `help <http://mxnet.io/api/python/optimization/optimization.html#the-mxnet-optimizer-package>`_
         on Optimizer for a list of available optimizers.
     optimizer_params : dict
         Key-word arguments to be passed to optimizer constructor. For example,
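
(A short usage sketch for the Trainer parameters documented in the hunk above; the network and learning rate are illustrative placeholders, not part of this patch.)

from mxnet import gluon

net = gluon.nn.Dense(1)
net.initialize()

# params: the set of parameters to optimize; optimizer: an optimizer
# name such as 'sgd'; optimizer_params: keyword arguments forwarded to
# the optimizer constructor.
trainer = gluon.Trainer(net.collect_params(), 'sgd',
                        {'learning_rate': 0.1})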
