formath commented on issue #7942: Adam optimizer consistent with paper
URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-332083021
Closed it now. Consider this later if needed. Thanks.
formath commented on issue #7942: Adam optimizer consistent with paper
URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-331806475
@sxjscience `rho` is `lambda` in the paper
```
Kingma, Diederik, and Jimmy Ba. "Adam: A method for stochastic
optimization." arXiv
```
formath commented on issue #7942: Adam optimizer consistent with paper
URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-331798138
@sxjscience `rho` is the `lambda` in the paper
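To make the correspondence concrete, here is a minimal sketch of a single Adam step with the paper's `lambda` decay applied to `beta1`, where `lam` plays the role the PR calls `rho`. The function name and signature are illustrative, not MXNet's actual API; with `lam=1.0` the update reduces to the familiar Adam rule.

```python
import math

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999,
              lam=1.0, eps=1e-8):
    """One scalar Adam update with the paper's lambda decay on beta1.

    `lam` corresponds to the paper's lambda (the PR's `rho`);
    lam=1.0 recovers the standard Adam update. Hypothetical sketch,
    not MXNet's implementation.
    """
    beta1_t = beta1 * lam ** (t - 1)          # decayed first-moment coefficient
    m = beta1_t * m + (1 - beta1_t) * grad    # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for m
    v_hat = v / (1 - beta2 ** t)              # bias correction for v
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v
```

At `t=1` the decay factor is `lam ** 0 == 1`, so the first step is identical for any `lam`; the decay only kicks in on later steps.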
formath commented on issue #7942: Adam optimizer consistent with paper
URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-331061537
In my use case, namely training a `field aware factorization machine` on 20
million features and billions of instances, setting `rho` to 0.999 really
formath commented on issue #7942: Adam optimizer consistent with paper
URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-331035824
@piiswrong I have set the default `rho` to 1.0. It gives the same result
as the original master but also gives users a chance to set
formath commented on issue #7942: Adam optimizer consistent with paper
URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-330748814
@sergeykolychev It has been merged.
@piiswrong Jenkins error: libdc1394 error: Failed to initialize libdc1394