kpot commented on issue #8337: mx.autograd.grad works or fails depending on use 
of slices
URL: 
https://github.com/apache/incubator-mxnet/issues/8337#issuecomment-337794186
 
 
   @ZiyueHuang `a`'s first dimension is 4, so slicing it as `a[0:4]` is perfectly valid; whether the slice is actually useful wasn't the point here. But after you asked, I tried different expressions and different sizes of `a` (for example, `a = mx.nd.array([[1, 2, 3, 4]])`, sliced in the expression as `a[0]`). None of that worked: I still see the same error every time slicing is used.
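
   For illustration, here is a minimal sketch of this kind of experiment. The exact expression for `b` is not important and is only chosen here so that `db / da = 4 * a`:

```python
import mxnet as mx

a = mx.nd.array([[1, 2, 3, 4]])
a.attach_grad()

# Without slicing the gradient is computed fine.
with mx.autograd.record():
    b = 2 * (a ** 2).sum()         # db / da = 4 * a
da, = mx.autograd.grad(b, [a])
print(da)                          # [[ 4.  8. 12. 16.]]

# Writing the same expression through a slice of `a` triggers the error.
with mx.autograd.record():
    b = 2 * (a[0] ** 2).sum()
da, = mx.autograd.grad(b, [a])     # fails here
```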
   
   `da_sym.list_arguments()` returns `['', 'var0']`. One of them must be the head gradient for the chain rule and the other a placeholder for the variable `a`; that's why I passed those arguments. Which is which I determined experimentally: the two have different shapes, and I could easily check the result of `executor.forward()` knowing that the derivative is `db / da = 4 * a`.
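
   Roughly, that check looked like the sketch below. Treat it as a best guess rather than the exact code: the expression for `b`, obtaining `da_sym` via `mx.autograd.get_symbol`, and the argument order passed to `bind` are all assumptions that would have to be confirmed against the shapes:

```python
import mxnet as mx

a = mx.nd.array([[1, 2, 3, 4]])
a.attach_grad()

with mx.autograd.record():
    b = 2 * (a ** 2).sum()                  # db / da = 4 * a

# create_graph=True records the backward pass so it can be exported as a Symbol.
da, = mx.autograd.grad(b, [a], create_graph=True)
da_sym = mx.autograd.get_symbol(da)
print(da_sym.list_arguments())              # ['', 'var0']

# Bind the arguments positionally: one is the head gradient (scalar-shaped,
# here all ones), the other is the placeholder for `a`. Which position is
# which has to be read off the shapes; the order below is an assumption.
executor = da_sym.bind(mx.cpu(), args=[mx.nd.ones((1,)), a])
print(executor.forward()[0])                # expected: 4 * a
```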
