barry-jin opened a new pull request #20426:
URL: https://github.com/apache/incubator-mxnet/pull/20426


   ## Description ##
   Optimizers in MXNet 2.0 have been refactored, and some now use new keyword names for their parameters, e.g. adagrad (eps -> epsilon) and rmsprop (gamma1 -> rho, gamma2 -> momentum). 
It is confusing for users migrating from v1.x to find that calling rmsprop with the old 
names raises the following error: 
   ```
   Traceback (most recent call last):
     File "optimizer_update.py", line 16, in <module>
       rmsprop_optimizer = optimizer.RMSProp(learning_rate=0.001, gamma1=0.9, gamma2=0.9, epsilon=1e-07, centered=False)
     File "/home/ubuntu/workspace/incubator-mxnet/python/mxnet/optimizer/rmsprop.py", line 73, in __init__
       super(RMSProp, self).__init__(learning_rate=learning_rate,
     File "/home/ubuntu/workspace/incubator-mxnet/python/mxnet/optimizer/optimizer.py", line 96, in __init__
       super(Optimizer, self).__init__(**kwargs)
   TypeError: object.__init__() takes exactly one argument (the instance to initialize)
   ```
   This PR adds warnings for the deprecated parameter names in the following 
optimizers: 
   1. adagrad: 
       - eps -> epsilon
   2. rmsprop:
       - gamma1 -> rho
       - gamma2 -> momentum   
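
   A minimal sketch of how such a compatibility shim could remap the old keyword names while emitting a warning (the `DEPRECATED_ARGS` table and `remap_deprecated` helper below are illustrative assumptions, not the actual code in this PR):

   ```python
   import warnings

   # Old-name -> new-name mapping per optimizer, as listed above.
   DEPRECATED_ARGS = {
       "adagrad": {"eps": "epsilon"},
       "rmsprop": {"gamma1": "rho", "gamma2": "momentum"},
   }

   def remap_deprecated(optimizer_name, kwargs):
       """Translate deprecated keyword names to their new ones, warning for each."""
       mapping = DEPRECATED_ARGS.get(optimizer_name, {})
       remapped = {}
       for key, value in kwargs.items():
           if key in mapping:
               new_key = mapping[key]
               warnings.warn(
                   "{} is deprecated for {}; use {} instead".format(
                       key, optimizer_name, new_key),
                   DeprecationWarning,
               )
               remapped[new_key] = value
           else:
               remapped[key] = value
       return remapped
   ```

   With this approach, `remap_deprecated("rmsprop", {"gamma1": 0.9, "learning_rate": 0.001})` returns `{"rho": 0.9, "learning_rate": 0.001}` and warns about `gamma1`, so v1.x call sites keep working instead of failing with the `TypeError` shown above.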
   
   ## Checklist ##
   ### Essentials ###
   - [x] PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
   - [x] Changes are complete (i.e. I finished coding on this PR)
   - [x] All changes have test coverage
   - [x] Code is well-documented
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

