starimpact opened a new issue #15102: the grad of lars should be scaled in lbsgd

URL: https://github.com/apache/incubator-mxnet/issues/15102

```python
def _get_lars(self, weight, g, wd):
    """Returns a scaling factor for the learning rate for this layer
    default is 1
    """
    weight2 = self._l2norm(weight)
    grad2 = self._l2norm(g)
    grad2 = grad2 * self.rescale_grad
    lars = math.sqrt(weight2 / (grad2 + wd * weight2 + 1e-18))
    if lars < 0.01:
        lars = 0.01
    elif lars > 100:
        lars = 100
    return lars
```
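For context, here is a minimal standalone sketch of the LARS layer-wise factor with the gradient norm scaled by `rescale_grad`, as the snippet above proposes. The `1e-18` epsilon and the `[0.01, 100]` clipping follow the quoted code; the `l2norm_sq` helper (assumed to return the squared L2 norm, matching the `weight2`/`grad2` naming) and the example values are assumptions for illustration, not the optimizer's actual implementation.

```python
import math
import numpy as np

def l2norm_sq(v):
    # Squared L2 norm; stands in for the optimizer's internal _l2norm helper
    # (an assumption here -- the real helper may differ).
    return float(np.sum(np.square(v)))

def get_lars(weight, grad, wd, rescale_grad=1.0):
    """LARS layer-wise learning-rate factor, with the gradient term scaled by
    rescale_grad as the issue suggests (mirrors the snippet above, where the
    squared gradient norm is multiplied by rescale_grad directly)."""
    weight2 = l2norm_sq(weight)
    grad2 = l2norm_sq(grad) * rescale_grad
    lars = math.sqrt(weight2 / (grad2 + wd * weight2 + 1e-18))
    # Clip the factor to the same [0.01, 100] range as the quoted code.
    return min(max(lars, 0.01), 100.0)

# Example: with a smaller rescale_grad (e.g. 1/batch_size), the effective
# gradient term shrinks and the resulting LARS factor grows accordingly.
w = np.random.randn(512).astype(np.float32)
g = np.random.randn(512).astype(np.float32)
print(get_lars(w, g, wd=1e-4, rescale_grad=1.0))
print(get_lars(w, g, wd=1e-4, rescale_grad=1.0 / 256))
```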
