I'm not sure I understand what you want to do. Do you want to add extra dependencies in the grad? I haven't seen this asked before. What exactly are you trying to do?
There is a way to stop the gradient from being propagated, and there is a way to alter the gradient to something else; see the sketches after the quoted message.

Fred

On Wed, Dec 21, 2016 at 6:58 AM, jbv <[email protected]> wrote:
> Hi,
>
> Is there a way to specify that, while computing a gradient, theano.grad
> should consider that a variable depends on other variables? For example,
> by attaching a grad method to a variable?
>
> The way I do that now is by defining a dummy Op that:
> - returns a predefined value in perform
> - has a grad method
>
> But I'm not satisfied with this method because it is cumbersome when I
> want to modify the predefined value. Any suggestions?
>
> Thanks.
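
To illustrate both points, here is a minimal sketch using helpers from
theano.gradient (zero_grad, disconnected_grad, grad_clip); the variables
are just illustrative:

import theano
import theano.tensor as T
from theano.gradient import disconnected_grad, zero_grad, grad_clip

x = T.scalar('x')

# Stop the gradient: zero_grad keeps the forward value but sends a zero
# gradient back; disconnected_grad cuts the path entirely (theano.grad
# then raises unless disconnected_inputs='ignore').
g_zero = theano.grad(zero_grad(x) ** 2, x)
g_disc = theano.grad(disconnected_grad(x) ** 2, x,
                     disconnected_inputs='ignore')

# Alter the gradient: grad_clip leaves the forward value unchanged but
# clips the gradient flowing back through it into [-1, 1].
g_clip = theano.grad(grad_clip(x, -1., 1.) ** 2, x)

f = theano.function([x], [g_zero, g_disc, g_clip])
print(f(5.0))  # [0.0, 0.0, 1.0]: the 2*x = 10 gradient is clipped to 1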

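For reference, here is a rough sketch of the dummy-Op approach described
in the question, as I understand it (all names here are hypothetical);
the grad method is where the extra dependency gets declared:

import numpy
import theano
import theano.tensor as T

class DummyOp(theano.Op):
    # Forward pass returns a fixed value; grad pretends the output
    # behaves like a * b, so the gradient flows onto a and b.
    __props__ = ('value',)

    def __init__(self, value):
        self.value = value

    def make_node(self, a, b):
        a = T.as_tensor_variable(a)
        b = T.as_tensor_variable(b)
        return theano.Apply(self, [a, b], [T.scalar()])

    def perform(self, node, inputs, output_storage):
        output_storage[0][0] = numpy.asarray(self.value,
                                             dtype=theano.config.floatX)

    def grad(self, inputs, output_grads):
        a, b = inputs
        g, = output_grads
        return [g * b, g * a]

# Usage: out always evaluates to 3.0, but theano.grad(out, a) is b.
a = T.scalar('a')
b = T.scalar('b')
out = DummyOp(3.0)(a, b)
ga = theano.grad(out, a)

Depending on what exactly you need, the known_grads argument of
theano.grad may also be worth a look: it lets you supply a gradient
expression for an intermediate variable directly, without writing an Op.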