[GitHub] formath commented on issue #7942: Adam optimizer consistent with paper

2017-09-25 Thread git
formath commented on issue #7942: Adam optimizer consistent with paper URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-332083021 Closed it now. Consider this later if needed. Thanks.

[GitHub] formath commented on issue #7942: Adam optimizer consistent with paper

2017-09-25 Thread git
formath commented on issue #7942: Adam optimizer consistent with paper URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-331806475 @sxjscience `rho` is `lambda` in the paper: Kingma, Diederik, and Jimmy Ba. "Adam: A method for stochastic optimization." arXiv preprint arXiv:1412.6980 (2014).
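For context on the mapping above: in Kingma & Ba's paper, `lambda` decays the first-moment coefficient `beta1` over time steps, and this PR's `rho` plays that role. A minimal NumPy sketch of one update with that decay, assuming an illustrative `adam_step` function (name and signature are hypothetical, not MXNet's actual API); with `lam == 1.0` it reduces to the standard Adam of Algorithm 1:

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999,
              lam=1.0, eps=1e-8):
    """One Adam update with the paper's beta1 decay (lambda; the PR's `rho`).

    Hypothetical helper for illustration only. With lam == 1.0 this is
    the standard Adam update of Kingma & Ba, Algorithm 1.
    """
    beta1_t = beta1 * lam ** (t - 1)            # decayed beta1 (Sec. 2 of the paper)
    m = beta1_t * m + (1.0 - beta1_t) * grad    # biased first-moment estimate
    v = beta2 * v + (1.0 - beta2) * grad ** 2   # biased second-moment estimate
    m_hat = m / (1.0 - beta1 ** t)              # bias correction as in Algorithm 1
    v_hat = v / (1.0 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

Note that at `t == 1` the decay factor `lam ** 0` is 1, so the first step is identical for any `lam`; the effect of `rho < 1.0` only appears over later steps.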

[GitHub] formath commented on issue #7942: Adam optimizer consistent with paper

2017-09-25 Thread git
formath commented on issue #7942: Adam optimizer consistent with paper URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-331798138 @sxjscience `rho` is the `lambda` in the paper

[GitHub] formath commented on issue #7942: Adam optimizer consistent with paper

2017-09-21 Thread git
formath commented on issue #7942: Adam optimizer consistent with paper URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-331061537 In my use case, namely training a `field aware factorization machine` on 20 million features and billions of instances, setting `rho` to 0.999 really

[GitHub] formath commented on issue #7942: Adam optimizer consistent with paper

2017-09-20 Thread git
formath commented on issue #7942: Adam optimizer consistent with paper URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-331035824 @piiswrong I have set the default `rho` to 1.0. It gives the same result as the original master but also gives users a chance to set

[GitHub] formath commented on issue #7942: Adam optimizer consistent with paper

2017-09-20 Thread git
formath commented on issue #7942: Adam optimizer consistent with paper URL: https://github.com/apache/incubator-mxnet/pull/7942#issuecomment-330748814 @sergeykolychev It has been merged. @piiswrong Jenkins error: libdc1394 error: Failed to initialize libdc1394
