marcoabreu commented on a change in pull request #16487: Fix learning rate scheduler being unexpectedly overwritten by optimizer's default value
URL: https://github.com/apache/incubator-mxnet/pull/16487#discussion_r334958118
 
 

 ##########
 File path: python/mxnet/optimizer/optimizer.py
 ##########
 @@ -97,14 +99,19 @@ class Optimizer(object):
        optimizer, its learning rate can be accessed as optimizer.learning_rate.
     """
     def __init__(self, rescale_grad=1., param_idx2name=None, wd=0.,
-                 clip_gradient=None, learning_rate=0.01,
+                 clip_gradient=None, learning_rate=None,
                  lr_scheduler=None, sym=None, begin_num_update=0,
                  multi_precision=False, param_dict=None):
         self.rescale_grad = rescale_grad
-        self.lr = learning_rate
         self.lr_scheduler = lr_scheduler
-        if lr_scheduler is not None:
-            self.lr_scheduler.base_lr = learning_rate
+        if self.lr_scheduler is None and learning_rate is None:
+            learning_rate = 0.01
+        self.lr = learning_rate
+        if self.lr_scheduler is not None and learning_rate is not None:
+            if self.lr_scheduler.base_lr != learning_rate:
+                raise UserWarning("learning rate from ``lr_scheduler`` has been "
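
 For reference, here is a minimal standalone sketch of the control flow this hunk introduces (a simplified free function, not the actual ``Optimizer.__init__``; the warning message is truncated in the hunk above, so a placeholder string is used):

 ```python
 def resolve_lr(learning_rate=None, lr_scheduler=None):
     # Fall back to the 0.01 default only when neither source of a
     # learning rate is given.
     if lr_scheduler is None and learning_rate is None:
         learning_rate = 0.01
     # When both are given and they disagree, the hunk currently
     # *raises*, which is the behavior the review comment below questions.
     if lr_scheduler is not None and learning_rate is not None:
         if lr_scheduler.base_lr != learning_rate:
             raise UserWarning("<placeholder for the truncated message>")
     return learning_rate
 ```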
 
 Review comment:
   Wouldn't this ``raise`` terminate execution and start exception handling? If so, the statement below would never be executed.
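
   For illustration, a minimal sketch of the difference (``warnings.warn`` is shown here as a plausible alternative; it is not what the hunk currently does):

   ```python
   import warnings

   def raises_user_warning():
       # ``raise UserWarning(...)`` raises an exception object: control
       # leaves the function immediately and unwinds to the nearest
       # matching ``except`` block, so nothing after this line runs.
       raise UserWarning("lr mismatch")
       print("never reached")

   def warns_and_continues():
       # ``warnings.warn(...)`` emits the warning through the warnings
       # machinery and then returns, so execution continues normally.
       warnings.warn("lr mismatch", UserWarning)
       print("still reached")

   try:
       raises_user_warning()
   except UserWarning as exc:
       print("caught:", exc)  # execution jumped straight to this handler

   warns_and_continues()      # prints the warning, then "still reached"
   ```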

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
