Theano uses the identity of the node instance to decide whether two
variables are the same, not their mathematical equivalence. Duplicate
computations are merged only during optimization. As the gradient is
computed before compilation, it works on the unmerged graph. This means
that these two nodes are completely different as far as grad is
concerned:

subset = lookup_table[vector_of_indices]
subset2 = lookup_table[vector_of_indices]  # a brand-new node, same math
cost = function_of(subset)
g = theano.grad(cost, subset2)             # error: cost does not use subset2

If you build the cost from subset but ask for the gradient with respect
to subset2, theano.grad does not know they are equivalent, so it raises
an error (a DisconnectedInputError) telling you that cost does not use
subset2 in its computation.
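
Here is a minimal, runnable sketch of the same situation (the shared
lookup table, its shape, and the sum() cost are just illustrative
stand-ins, not anything specific from the FAQ):

import numpy as np
import theano
import theano.tensor as T

# Illustrative stand-ins: any shared matrix and index vector will do.
lookup_table = theano.shared(np.zeros((10, 5), dtype='float32'))
vector_of_indices = T.ivector('indices')

subset = lookup_table[vector_of_indices]
subset2 = lookup_table[vector_of_indices]  # a distinct node, same math
cost = subset.sum()

g_ok = theano.grad(cost, wrt=subset)    # fine: cost was built from this node
g_bad = theano.grad(cost, wrt=subset2)  # raises DisconnectedInputError

The fix is simply to keep the first subset variable around and reuse it
everywhere, including in the call to theano.grad.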

Fred

On Tue, Aug 23, 2016 at 7:56 AM, Ozan Çağlayan <[email protected]> wrote:

> Hi,
>
> In http://deeplearning.net/software/theano/tutorial/faq_tutorial.html
>
> it says that:
>
> subset = lookup_table[vector_of_indices]
>
> From now on, use only ‘subset’. Do not call lookup_table[vector_of_indices]
> again. This causes problems with grad as this will create new variables.
>
> What exactly are the "problems" mentioned here? Should fetching a
> subtensor out of the lookup table be avoided completely?
>
> Thanks.
> --
> Ozan Çağlayan
> Research Assistant
> Galatasaray University - Computer Engineering Dept.
> http://www.ozancaglayan.com
>
