lightingghost opened a new issue #10002: _backward_Convolution is non-differentiable
URL: https://github.com/apache/incubator-mxnet/issues/10002
 
 
   Since MXNet's `autograd` package supports higher-order gradients, I tried to implement WGAN-GP with MXNet, but I got the following error:
   
   ```
   mxnet.base.MXNetError: [08:32:17] C:\projects\mxnet-distro-win\mxnet-build\nnvm\src\pass\gradient.cc:187: Operator _backward_Convolution is non-differentiable because it didn't register FGradient attribute.
   ```
   
   It seems the convolution operator still does not support higher-order gradients?
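   
   For context, here is a minimal sketch (not my actual code) of the WGAN-GP gradient-penalty step that needs second-order gradients through a convolution; the `critic` network and input shapes are placeholders:
   
   ```python
   import mxnet as mx
   from mxnet import nd, autograd, gluon
   
   # Placeholder critic containing a convolution, so the second-order
   # pass has to differentiate _backward_Convolution.
   critic = gluon.nn.Sequential()
   critic.add(gluon.nn.Conv2D(channels=8, kernel_size=3),
              gluon.nn.Dense(1))
   critic.initialize()
   
   # Stand-in for the interpolation between real and fake samples.
   x = nd.random.uniform(shape=(4, 3, 32, 32))
   x.attach_grad()
   
   with autograd.record():
       score = critic(x)
       # First-order gradient of the critic output w.r.t. its input;
       # create_graph=True records this gradient computation so that it
       # can itself be differentiated.
       grads = autograd.grad(score, [x], create_graph=True, retain_graph=True)[0]
       grad_norm = ((grads ** 2).reshape((4, -1)).sum(axis=1)).sqrt()
       penalty = ((grad_norm - 1) ** 2).mean()
   
   # Differentiating the penalty requires the gradient of
   # _backward_Convolution, which is where the error above is raised.
   penalty.backward()
   ```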
   
