sxjscience commented on issue #10002: General support of OPs for second-order gradient
URL: https://github.com/apache/incubator-mxnet/issues/10002#issuecomment-370929547
 
 
   @lightingghost  Borrowing the discussion from https://github.com/apache/incubator-mxnet/issues/9979 here:
   ```python
   import mxnet.ndarray as nd
   from mxnet import autograd

   x = nd.array([3.0])
   x.attach_grad()
   with autograd.record():
       y = x ** 2
       # First-order gradient dy/dx, kept in the graph
       # (create_graph=True) so it can be differentiated again.
       y_grad = autograd.grad(y, x, create_graph=True, retain_graph=True)[0]
       z = y_grad ** 2
   z.backward()
   print(x.grad)  # the gradient of z accumulates into x.grad, not z.grad
   ```
   ```log
   MXNetError: [12:44:29] src/pass/gradient.cc:187: Operator _backward_power_scalar is non-differentiable because it didn't register FGradient attribute.
   ```
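   For reference, the value the second-order pass should produce can be checked by hand: with y = x², the first gradient is 2x, so z = (2x)² = 4x² and dz/dx = 8x, which is 24 at x = 3. Below is a minimal plain-Python sketch of that check; the helper names are my own, not MXNet API:
   ```python
   # Hand-derived check of the expected second-order result for y = x**2.
   # Plain Python only; first_grad/z/z_grad are illustrative names.
   def first_grad(x):   # dy/dx = 2x
       return 2.0 * x

   def z(x):            # z = (dy/dx)**2 = 4x**2
       return first_grad(x) ** 2

   def z_grad(x):       # dz/dx = 8x
       return 8.0 * x

   print(z(3.0))        # 36.0
   print(z_grad(3.0))   # 24.0 -> the value x.grad should hold after z.backward()
   ```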
