Hi, is there a way to tell theano.grad, while it computes a gradient, that a variable depends on other variables? For example, by attaching a grad method to the variable?
The way I do it now is by defining a dummy Op that:
- returns a predefined value in perform, and
- has a grad method.

I'm not satisfied with this approach because it is cumbersome whenever I want to modify the predefined value. Any suggestions? Thanks.

--
You received this message because you are subscribed to the Google Groups "theano-users" group.
