[GitHub] [incubator-mxnet] kshitij12345 commented on issue #14613: [MXNET-978] Higher order gradient support for some unary operators

2019-06-04 Thread GitBox
kshitij12345 commented on issue #14613: [MXNET-978] Higher order gradient 
support for some unary operators
URL: https://github.com/apache/incubator-mxnet/pull/14613#issuecomment-498775984
 
 
   @apeforest
   
   Could you update `check_second_order_unary` as per
   
   https://github.com/apache/incubator-mxnet/blob/37ce3b87268a8154f5c0ad97ce2522478038ee06/tests/python/unittest/test_higher_order_grad.py#L76-L103
   
   This also covers the check for the gradient of the first input argument. I have tested a similar PyTorch script, which works (code in PR #15120).
   
   However, do note that for PR #15120 the assertion at
   
   https://github.com/apache/incubator-mxnet/blob/37ce3b87268a8154f5c0ad97ce2522478038ee06/tests/python/unittest/test_higher_order_grad.py#L102
   
   fails, with `head_grads.grad.asnumpy()` being all zeros.
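   
   For reference, a minimal PyTorch sketch of the kind of check involved (not the exact script from PR #15120; the shapes and head-gradient values below are just illustrative):
   
   ```Python
    import torch
    
    # Positive inputs so that log() stays well-defined.
    x = (torch.rand(3, 3) + 0.1).requires_grad_(True)
    head_grads = torch.rand(3, 3).requires_grad_(True)
    
    y = torch.log(x)
    # First-order gradient, keeping the graph so it can be differentiated again.
    (y_grad,) = torch.autograd.grad(y, x, grad_outputs=head_grads, create_graph=True)
    
    v = torch.rand(3, 3)
    y_grad.backward(v)
    
    # d(y_grad)/dx = head_grads * (-1 / x**2), contracted with v.
    torch.testing.assert_allclose(x.grad, (head_grads * (-1.0 / x ** 2) * v).detach())
    # d(y_grad)/d(head_grads) = 1 / x, contracted with v -- this is the quantity
    # that comes out as all zeros on the MXNet side for PR #15120.
    torch.testing.assert_allclose(head_grads.grad, ((1.0 / x) * v).detach())
   ```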
   
   Please check to see if it works for you.
   Thank You.




[GitHub] [incubator-mxnet] kshitij12345 commented on issue #14613: [MXNET-978] Higher order gradient support for some unary operators

2019-06-01 Thread GitBox
kshitij12345 commented on issue #14613: [MXNET-978] Higher order gradient 
support for some unary operators
URL: https://github.com/apache/incubator-mxnet/pull/14613#issuecomment-497920495
 
 
   @apeforest As mentioned in #14992 for `log`, I suspect the check will fail with the following script:
   ```Python
    from mxnet import nd, autograd
    import numpy
    
    # Second derivative of log(x) is -1 / x**2.
    grad_grad_op = lambda x: -1 / x**2
    
    x = nd.random.normal(0, 1, (3, 3))
    x.attach_grad()
    with autograd.record():
        y = nd.log(x)
        # First-order gradient with a 0.5 head gradient; keep the graph so the
        # result can be differentiated again.
        y_grad = autograd.grad(y, x, head_grads=nd.ones_like(y) * 0.5,
                               create_graph=True, retain_graph=True)[0]
    # Second backward pass with a 0.6 head gradient.
    y_grad.backward(nd.ones_like(y_grad) * 0.6)
    
    # x.grad should carry both head gradients: grad_grad_op(x) * 0.5 * 0.6.
    numpy.testing.assert_allclose(x.grad.asnumpy(),
                                  (grad_grad_op(x) * 0.5 * 0.6).asnumpy(),
                                  rtol=1e-7, atol=1e-7)
   ```
   
   The same check would fail for `sin` and `cos` as well, since the `grad` from the upper layer is not preserved for them.
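   
   For instance, adapting the script above to `sin` (a sketch; the 0.5/0.6 head gradients are again just illustrative values), the assertion only passes if the 0.5 head gradient is propagated into `x.grad`:
   
   ```Python
    from mxnet import nd, autograd
    import numpy
    
    # Second derivative of sin(x) is -sin(x).
    grad_grad_op = lambda x: -nd.sin(x)
    
    x = nd.random.normal(0, 1, (3, 3))
    x.attach_grad()
    with autograd.record():
        y = nd.sin(x)
        y_grad = autograd.grad(y, x, head_grads=nd.ones_like(y) * 0.5,
                               create_graph=True, retain_graph=True)[0]
    y_grad.backward(nd.ones_like(y_grad) * 0.6)
    
    # Expect x.grad == -sin(x) * 0.5 * 0.6; if the head gradient is dropped,
    # the 0.5 factor will be missing.
    numpy.testing.assert_allclose(x.grad.asnumpy(),
                                  (grad_grad_op(x) * 0.5 * 0.6).asnumpy(),
                                  rtol=1e-7, atol=1e-7)
   ```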




[GitHub] [incubator-mxnet] kshitij12345 commented on issue #14613: [MXNET-978] Higher order gradient support for some unary operators

2019-05-30 Thread GitBox
kshitij12345 commented on issue #14613: [MXNET-978] Higher order gradient 
support for some unary operators
URL: https://github.com/apache/incubator-mxnet/pull/14613#issuecomment-497434499
 
 
   @apeforest Thank you. The only thing that bothered me was the inconsistent use of value initialisation and list initialisation; list initialisation should be preferred:
   
   https://stackoverflow.com/questions/18222926/why-is-list-initialization-using-curly-braces-better-than-the-alternatives
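   
   As a generic illustration of the point (not code from this PR): brace (list) initialisation rejects narrowing conversions that parenthesis-style initialisation performs silently.
   
   ```c++
    #include <vector>
    
    int main() {
      double d = 3.7;
      int a(d);                      // parenthesis-style: compiles, silently truncates to 3
      // int b{d};                   // list initialisation: ill-formed (narrowing conversion)
      std::vector<int> v{1, 2, 3};   // list-initialises the container's elements
      return a + static_cast<int>(v.size());
    }
   ```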
   
   Otherwise LGTM.

