The shared variable is the right way. The Op should take it as input and
return a new value for it. At first, have it return new data rather than
modifying the old data in place. In-place updates are possible with more
work (you need to tell Theano about it), but save that for when everything
works and you actually see a bottleneck from not doing it in place.

Fred

On Jan 20, 2017, 12:51, "Kiuhnm Mnhuik" <kiuhnm2...@gmail.com> wrote:

> Hi everyone,
>
> I defined an Op with a custom gradient. Is there a way to "leak out" some
> statistics about the computation of the gradient?
> For instance, let's say the gradient is computed iteratively and we'd like
> to return some statistics about the computation itself to monitor it.
> I created a shared variable, but I soon realized I have no way of
> updating it from inside the computation of the gradient :(
>
> --
>
> ---
> You received this message because you are subscribed to the Google Groups
> "theano-users" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to theano-users+unsubscr...@googlegroups.com.
> For more options, visit https://groups.google.com/d/optout.
>
