PistonY opened a new issue #12529: Operator _backward_FullyConnected is non-differentiable because it didn't register FGradient attribute.
URL: https://github.com/apache/incubator-mxnet/issues/12529
 
 
   I'm trying to implement [WGAN-GP](https://arxiv.org/pdf/1704.00028.pdf) in Gluon. The paper defines a "gradient penalty" term, which I implemented like this:
   ```python
   from mxnet import nd, autograd


   def calc_gradient_penalty(netD, real_data, fake_data, LAMBDA, ctx):
       real_data = real_data.as_in_context(ctx)
       b_s = real_data.shape[0]

       # Sample one random interpolation coefficient per example and
       # interpolate between real and fake data.
       alpha = nd.random.uniform(0, 1, shape=(b_s, 1), ctx=ctx)
       alpha = alpha.broadcast_to(real_data.shape)
       interpolates = alpha * real_data + (1 - alpha) * fake_data

       interpolates = nd.array(interpolates, ctx=ctx)
       interpolates.attach_grad()
       disc_interpolates = netD(interpolates)

       # First-order gradient of the critic w.r.t. the interpolates,
       # kept in the graph (create_graph=True) so the penalty itself
       # can later be backpropagated.
       gradients = autograd.grad(heads=disc_interpolates,
                                 variables=interpolates,
                                 head_grads=nd.ones(shape=disc_interpolates.shape, ctx=ctx),
                                 create_graph=True, retain_graph=True,
                                 train_mode=True)[0]

       gradients = gradients.reshape((gradients.shape[0], -1))
       gradient_penalty = ((gradients.norm(2, axis=1, keepdims=True) - 1) ** 2).mean() * LAMBDA
       return gradient_penalty
   ```
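   For reference, the quantity this is meant to compute is the gradient-penalty term from the paper (my transcription):
   ```
   \lambda \, \mathbb{E}_{\hat{x}} \big[ (\lVert \nabla_{\hat{x}} D(\hat{x}) \rVert_2 - 1)^2 \big],
   \qquad \hat{x} = \alpha x_{real} + (1 - \alpha) x_{fake}, \quad \alpha \sim U[0, 1]
   ```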
   But when I call `backward()` on the result, the following error is raised: `Operator _backward_FullyConnected is non-differentiable because it didn't register FGradient attribute.`
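   As far as I can tell, backpropagating through the first-order gradient requires a second-order gradient of `FullyConnected`. A minimal sketch like the following (names illustrative, assuming the standard `mxnet.autograd` API) seems to hit the same error:
   ```python
   from mxnet import nd, autograd, gluon

   net = gluon.nn.Dense(1)  # a single FullyConnected layer
   net.initialize()

   x = nd.random.uniform(shape=(4, 8))
   x.attach_grad()
   with autograd.record():
       y = net(x)
       # First-order input gradient, kept in the graph via create_graph=True.
       grads = autograd.grad(heads=y, variables=x,
                             head_grads=nd.ones_like(y),
                             create_graph=True, retain_graph=True)[0]
       penalty = ((grads.norm() - 1) ** 2).mean()
   # Backpropagating through the first-order gradient needs the second-order
   # gradient of FullyConnected, which raises the error above.
   penalty.backward()
   ```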
   The complete code is [here](https://gist.github.com/PistonY/ec4cdc76335dcba74c457b6d22e55ebc).
   How can I solve this?
