You can take the subtensor of the input and use that as the input of the
graph. That way, all the useless computation is removed.

Example:

Start the graph from

inp[I:I+1] instead of inp, where I is symbolic.
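
A minimal sketch of that idea (the variable names, shapes, and the toy
layer are mine for illustration, not from the thread):

    import numpy as np
    import theano
    import theano.tensor as T

    inp = T.matrix('inp')    # full minibatch, shape (batch, n_in)
    I = T.lscalar('I')       # symbolic example index
    x = inp[I:I+1]           # subtensor: only this one row feeds the graph

    # toy layer built on the slice instead of the full input
    W = theano.shared(np.random.randn(5, 3).astype(theano.config.floatX),
                      name='W')
    h = T.tanh(T.dot(x, W))
    cost = h.sum()

    # gradient w.r.t. the slice; the other rows of `inp` are never touched
    g = T.grad(cost, x)
    f = theano.function([inp, I], [h, g])

    batch = np.random.randn(10, 5).astype(theano.config.floatX)
    h_val, g_val = f(batch, 2)   # only example 2 goes through the graph

Because the graph starts at inp[I:I+1], Theano only propagates that single
row instead of computing on the whole minibatch and discarding most of it.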

Fred

On Fri, Mar 31, 2017 at 05:06, Jan Kukačka <[email protected]> wrote:

> Hi Pascal,
>
> thanks for the reply. It would indeed be equivalent, however, it seems to
> be terribly inefficient. Are there any intentions to implement this? It is
> an important feature for many neural network regularizers, such as
> tangent propagation or contractive autoencoders.
>
> Best regards,
> Jan
>
