Hmm. I never get an error from theano.grad, but I have a usage like the
one below, which takes two embeddings of the same vector_of_indices, once
in the original order and once reversed. Does this cause any problem
during backprop? More generally, is it OK to use the same tensor several
times in the graph, either in full or as a subtensor of it?


forw_embs = emb[x.flatten()]        # first lookup into emb
x_rev = x[::-1]                     # same indices, reversed
back_embs = emb[x_rev.flatten()]    # second lookup into the same emb

bidir_emb = tensor.concatenate([forw_embs, back_embs], ...)
...

grads = theano.grad(cost, [..., ..., ..., emb])
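
For reference, here is a minimal self-contained sketch of the same pattern
(toy shapes, a toy sum cost, all names hypothetical) that can be run to
inspect the gradient emb receives from the two lookups:

import numpy as np
import theano
import theano.tensor as T

# toy embedding table: 10 rows, 4 columns (hypothetical shape)
emb = theano.shared(
    np.random.randn(10, 4).astype(theano.config.floatX), name='emb')
x = T.ivector('x')

forw_embs = emb[x]        # first use of emb
back_embs = emb[x[::-1]]  # second use, same indices reversed
bidir_emb = T.concatenate([forw_embs, back_embs], axis=1)

cost = bidir_emb.sum()    # toy cost so the gradient is easy to check
g_emb = theano.grad(cost, emb)

f = theano.function([x], g_emb)
print(f(np.asarray([0, 1, 2], dtype='int32')))
# every element of an indexed row comes out as 2.0, i.e. one
# contribution from each of the two subtensor uses of emb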
