+theano-users, BCC: theano-dev

Unless you're including the gradient in the outputs or updates of your Theano function, it won't keep intermediate results around. But it sounds like you are taking gradients and would prefer not to backprop through this additional side computation.
I have used the gradient-blocking identity-op trick you describe before with success (passing consider_constant to grad() should do the same, I think). I'd debugprint or pydotprint/d3viz your gradient's graph before and after adding this branch with the gradient blocker, and verify that it's having the effect you think it is. Long compile time when you're altering the graph is normal; subsequent compiles of the same graph should benefit from caching.

On Sat, Dec 31, 2016, 9:54 AM <[email protected]> wrote:

> I wish to exclude a branch of a Theano graph from backpropagation so that
> only a forward pass is performed and there is no need to store intermediate
> results. The branch is the output of a 3D filter used in a deep CNN, and the
> computation is performed for each filter. The computation output is combined
> with the filter output to adjust its threshold, so it acts as an automatic
> gain control. Currently, adding this side-branch computation comes at an
> enormous 6x slowdown in graph execution and a much longer compilation time,
> even though the branch is very simple. Also, this is not intended to be
> trainable, and I specifically want to prevent backpropagation from using
> this branch to feed error through to the filter for training.
>
> I tried making a Python custom op whose op.grad method returns zeros (and
> also DisconnectedType) and calling this dummy op on the output of the
> branch and at the input of the branch, but this has no effect on either
> compilation or execution time.
>
> Anyone have any ideas?
>
> Thanks in advance.
>
> Brendan Ruff
>
> --
> You received this message because you are subscribed to the Google Groups
> "theano-dev" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> For more options, visit https://groups.google.com/d/optout.
