[GitHub] lightingghost opened a new issue #10002: _backward_Convolution is non-differentiable

2018-03-06 Thread GitBox
lightingghost opened a new issue #10002: _backward_Convolution is non-differentiable URL: https://github.com/apache/incubator-mxnet/issues/10002 Since mxnet has an `autograd` package that supports higher-order gradients, I tried to implement WGAN-GP with mxnet, but I got an error of ```
