It is always possible to use scan for that; in fact, that is what theano.gradient.jacobian does: it loops over the rows of the Jacobian. See https://github.com/Theano/Theano/blob/4a8fed96fb8d5faaff4441d19c8aca33350e3db5/theano/gradient.py#L1769
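For concreteness, here is a minimal sketch of that row-by-row approach using theano.scan directly; the elementwise function y = x ** 2 is just an illustration, not something from the thread:

import theano
import theano.tensor as T

x = T.dvector('x')
y = x ** 2  # example function (an assumption for illustration); its Jacobian is diag(2 * x)

# Scan over the output indices, computing one row of the Jacobian
# per iteration; this is essentially the graph theano.gradient.jacobian builds.
J, updates = theano.scan(
    lambda i, y, x: T.grad(y[i], x),
    sequences=T.arange(y.shape[0]),
    non_sequences=[y, x],
)
jacobian_fn = theano.function([x], J, updates=updates)

print(jacobian_fn([1.0, 2.0, 3.0]))
# [[ 2.  0.  0.]
#  [ 0.  4.  0.]
#  [ 0.  0.  6.]]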
Some graph optimizations may push parts of the computation out of the loop to avoid recomputing the same thing, or vectorize parts of it, but unfortunately this would still be a performance hit compared to a more optimized expression of the same computation.

On Friday, June 16, 2017 at 12:35:45 PM UTC-4, Juan Camilo Gamboa Higuera wrote:
> Hi,
>
> Is it possible to do this, e.g. the way autograd does it?
> https://github.com/HIPS/autograd/blob/a055b4282e4e9ac91843463aea04bc7cdee8716f/autograd/convenience_wrappers.py#L30
>
> Thanks!
>
> --Juan Camilo
