ThomasDelteil edited a comment on issue #11385: About loss
URL: 
https://github.com/apache/incubator-mxnet/issues/11385#issuecomment-402874669
 
 
   @yyl199655 for more information on automatic differentiation (autograd), you can 
follow the Gluon Crash Course, in particular this 
[chapter](https://gluon-crash-course.mxnet.io/autograd.html)
   
   To expand on @szha's explanation: you simply need to compute your loss 
function within the `with autograd.record():` scope, and then you can call 
`loss.backward()` on your loss term to automatically compute the gradient of 
that loss with respect to the parameters that were used to compute it.
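   A minimal sketch of that pattern (the network, data shapes, and loss choice here are just illustrative, not taken from the issue):
   
   ```python
   import mxnet as mx
   from mxnet import autograd, gluon, nd
   
   # Illustrative model and loss: one dense layer, L2 loss.
   net = gluon.nn.Dense(1)
   net.initialize()
   loss_fn = gluon.loss.L2Loss()
   
   x = nd.random.uniform(shape=(4, 2))
   y = nd.random.uniform(shape=(4, 1))
   
   # Record the forward computation so gradients can be derived from it.
   with autograd.record():
       output = net(x)
       loss = loss_fn(output, y)
   
   # Backpropagate: fills the .grad() arrays of the parameters used above.
   loss.backward()
   
   print(net.weight.grad().shape)  # gradient has the same shape as the weight
   ```
   
   After `loss.backward()`, a `gluon.Trainer.step()` call would apply those gradients to update the parameters.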
   
   @yyl199655 if you want to follow up, please open a thread on 
https://discuss.mxnet.io. Thanks!
   @szha could you please close the issue? Thanks!
