anirudhacharya commented on issue #11865: attach_grad of intermediate variables causes the gradient graph to be lost URL: https://github.com/apache/incubator-mxnet/issues/11865#issuecomment-514317891 As per the autograd documentation here - https://www.d2l.ai/chapter_crashcourse/autograd.html#attach-gradients-to-internal-variables - it seems the computation graph is expected to be thrown away when we execute `x.attach_grad()`, because `detach()` is implicitly run every time `attach_grad()` is called. We need a clear understanding of what the expected behavior is here.
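For concreteness, here is a minimal sketch of that behavior, adapted from the example in the linked d2l section (the input values and variable names are illustrative, not taken from this issue's repro):

```python
from mxnet import autograd, nd

x = nd.array([1.0, 2.0, 3.0])
x.attach_grad()

with autograd.record():
    y = x * 2
    # attach_grad() on the intermediate y implicitly runs detach(),
    # so the subgraph y = x * 2 is dropped from the recorded graph.
    y.attach_grad()
    z = y * x

z.backward()

# With y detached, z is differentiated treating y as a constant:
# dz/dx = y = [2. 4. 6.], not d(2x^2)/dx = 4x = [4. 8. 12.].
print(x.grad)  # [2. 4. 6.]
print(y.grad)  # dz/dy = x = [1. 2. 3.]
```

If the intention is that users can inspect gradients of intermediate variables without severing the graph, this implicit `detach()` is surprising; if it is intended, it should at least be called out explicitly in the `attach_grad()` documentation.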
