apeforest commented on issue #14613: [MXNET-978] Second order gradient support for some unary operators URL: https://github.com/apache/incubator-mxnet/pull/14613#issuecomment-499695476

To summarize the discussion in https://github.com/apache/incubator-mxnet/pull/15120: in MXNet we calculate input gradients only for the variables specified in the `autograd.grad()` API. In this case, `y_grad`, even if someone manually attaches a grad buffer to it, will not be assigned gradient values during the second pass of `backward()`; only the second-order gradient in `x.grad` is preserved. This behavior is different from PyTorch, but I think it is not a bug in MXNet.
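A minimal sketch of the behavior described above, assuming `sin` is one of the unary operators covered by this PR's second-order support; the variable names (`x`, `y`, `y_grad`) follow the discussion and are otherwise arbitrary:

```python
import mxnet as mx
from mxnet import nd, autograd

x = nd.array([0.5, 1.0, 2.0])
x.attach_grad()

with autograd.record():
    y = nd.sin(x)
    # First-order gradient dy/dx, keeping the graph so it can be
    # differentiated again in a second backward pass.
    y_grad = autograd.grad(y, x, create_graph=True, retain_graph=True)[0]

# Second pass: differentiate y_grad with respect to x.
y_grad.backward()

# x.grad now holds the second-order gradient d2y/dx2 = -sin(x).
print(x.grad)

# Per the discussion above, attaching a grad buffer to y_grad would not make
# it receive gradient values during this second pass; only x.grad is populated,
# because x is the variable that was passed to autograd.grad().
```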
