ThomasDelteil commented on issue #14340: [bug] Bug in Gradient flow with backward(retain_graph=True) and split()
URL: https://github.com/apache/incubator-mxnet/issues/14340#issuecomment-483453983

Yes, your understanding is correct.

On Tue, Apr 16, 2019, 01:14 Vikas89 <[email protected]> wrote:

> @ThomasDelteil <https://github.com/ThomasDelteil> I tried running this code on my Mac:
>
> ```python
> import mxnet as mx
> from mxnet import autograd
>
> ctx = mx.cpu()
> data = mx.nd.ones((1, 10), ctx=ctx)
> param = mx.nd.ones((1, 10), ctx=ctx)
> data.attach_grad(grad_req='write')
> param.attach_grad(grad_req='write')
>
> with autograd.record():
>     z = data * param
>     # print(z.shape)
>     z1, z2 = z.split(2, 1)
>     # z1, z2 = z[:, :5], z[:, 5:]
>
> z1.backward(retain_graph=True)
> print(data.grad.asnumpy())
> z2.backward(retain_graph=True)
> print(data.grad.asnumpy())
> ```
>
> The results I get are:
>
> ```
> [[1. 1. 1. 1. 1. 0. 0. 0. 0. 0.]]
> [[ 0.0000000e+00  2.0000000e+00  2.5715471e+36 -4.6577453e-10
>    1.6986417e-14  3.6433760e-44  4.8807685e+13  6.0397474e+26
>    4.3427447e-20  1.4012985e-45]]
> ```
>
> I believe the expected output of the second `print(data.grad.asnumpy())` is:
>
> ```
> [[0. 0. 0. 0. 0. 1. 1. 1. 1. 1.]]
> ```
>
> Can you confirm my understanding is correct?
>
> I also verified that `split_v2` gives correct results.
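Since the quoted repro notes that `split_v2` behaves correctly, here is a minimal sketch of the same reproduction using `mx.nd.split_v2` as a workaround. It assumes an MXNet build where `mx.nd.split_v2` is available; the expected values in the comments follow the reasoning in the quoted message and are not re-verified here.

```python
# Sketch: same repro as above, but splitting with split_v2 instead of split.
# Assumes an MXNet version that ships mx.nd.split_v2.
import mxnet as mx
from mxnet import autograd

ctx = mx.cpu()
data = mx.nd.ones((1, 10), ctx=ctx)
param = mx.nd.ones((1, 10), ctx=ctx)
data.attach_grad(grad_req='write')   # 'write' overwrites grads on each backward
param.attach_grad(grad_req='write')

with autograd.record():
    z = data * param
    # Split z into 2 equal sections along axis 1
    z1, z2 = mx.nd.split_v2(z, 2, axis=1)

z1.backward(retain_graph=True)
print(data.grad.asnumpy())  # expected: [[1. 1. 1. 1. 1. 0. 0. 0. 0. 0.]]
z2.backward(retain_graph=True)
print(data.grad.asnumpy())  # expected: [[0. 0. 0. 0. 0. 1. 1. 1. 1. 1.]]
```

With `grad_req='write'`, each backward pass overwrites `data.grad`, so the second print should show gradient only in the second half; the garbage values in the quoted output are what indicate the bug in the old `split` operator's backward.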
