Mostly, to make it clearer what should be "inside" the loop and what
should be "outside". It makes it easier to know which variables
gradients have to be accumulated on at each step.

There would be other ways of achieving that, but that's the implementation
for now.
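
For illustration, here is a minimal sketch (the names W, x0 and step are just
placeholders I made up) of how passing a shared variable through
``non_sequences`` makes it an explicit input of the step function, so scan
knows it lives outside the loop:

    import numpy as np
    import theano
    import theano.tensor as T

    # A shared weight matrix that stays constant across scan steps.
    W = theano.shared(np.eye(3, dtype=theano.config.floatX), name="W")
    x0 = T.vector("x0")

    # W arrives as an explicit argument of the step function because it is
    # listed in non_sequences, rather than being captured by closure.
    def step(prev, W):
        return T.dot(prev, W)

    results, updates = theano.scan(
        fn=step,
        outputs_info=x0,
        non_sequences=[W],
        n_steps=5,
        strict=True,  # raise an error if fn uses a shared variable
                      # that was not passed explicitly
    )

    f = theano.function([x0], results[-1], updates=updates)

With strict=True, anything the step function silently pulled in from the
enclosing scope would be rejected, which is exactly the point: everything the
inner graph depends on is declared up front.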

On Sun, Mar 26, 2017, Stephan Sahm wrote:
> Dear theano community,
> 
> starting to use scan more and more, I wondered why there is the 
> ``non-sequences`` parameter needed for scan to work?
> I mean, instead of letting the accumulating function depend on these via 
> function parameters, I could just refer to them by reference, as they are 
> unique.
> 
> For shared variables the library even implements the ``strict`` flag which 
> seems to ensure that everything is in fact passed via ``non-sequences`` 
> (see e.g. 
> http://deeplearning.net/software/theano/library/scan.html#using-shared-variables-the-strict-flag).
> Can someone explain why?
> 
> thanks in advance
> 


-- 
Pascal

