On Wed, Aug 31, 2016 at 1:47 PM, AA <[email protected]> wrote:

> Alright, so when calculating the derivative with respect to a shared
> variable that has been replicated, Theano won't consider appearances of
> the cloned shared variable, but the value of the clone will still change
> when the value of the original is updated, correct?
>

yes
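
For example, here is a small sketch of that behavior (the variable names
are just for illustration, and I assume a 0-d float shared variable to
keep it short):

import numpy as np
import theano
import theano.tensor as T

# A shared variable and a simple expression that uses it.
w = theano.shared(np.asarray(2.0), name='w')
y = w ** 2

# Clone the expression without sharing inputs: the shared variable
# inside y2 is a new Variable object.
y2 = theano.clone(y, share_inputs=False)

# The gradient only follows appearances of w itself; the cloned shared
# variable inside y2 is a different node, so y2 contributes nothing.
g = T.grad(y + y2, wrt=w)

f = theano.function([], [y, y2, g])
print(f())                    # [4.0, 4.0, 4.0]

# Updating the original's value is reflected in the clone, because both
# point to the same underlying storage.
w.set_value(np.asarray(3.0))
print(f())                    # [9.0, 9.0, 6.0]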


> If so, how do I reference the cloned shared variable, so that I can take
> the derivative with respect to it as well?
>

Why do you want that? You should not mix those graphs; I don't think it is a
good way to do what you want. What are you trying to do?

Fred


>
> Thanks for the help :)
> On Wednesday, August 31, 2016 at 20:17:23 UTC+3, nouiz wrote:
>>
>> Hi,
>>
>> When we clone a shared variable, it creates a new object, so comparing
>> the clone and the original for equality returns False.
>>
>> But they represent the same value under the hood. This means that if you
>> change the value of either the clone or the original shared variable,
>> the change is reflected in the other: they hold a pointer to the same
>> underlying data.
>>
>> This is not the same as making two shared variables with the same
>> initial value; in that case, a change to one's value isn't replicated to
>> the other.
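>>
>> Roughly, something like this sketch (the variable names are just for
>> illustration; I assume 0-d float shared variables):
>>
>> import numpy as np
>> import theano
>>
>> a = theano.shared(np.asarray(1.0), name='a')
>> a_clone = a.clone()
>>
>> # The clone is a new object, so comparing them gives False.
>> print(a_clone is a)          # False
>> print(a_clone == a)          # False
>>
>> # But they share the same container: changing one changes the other.
>> a.set_value(np.asarray(5.0))
>> print(a_clone.get_value())   # 5.0
>>
>> # Two independently created shared variables with the same initial
>> # value do NOT share storage.
>> b = theano.shared(np.asarray(1.0), name='b')
>> c = theano.shared(np.asarray(1.0), name='c')
>> b.set_value(np.asarray(5.0))
>> print(c.get_value())         # still 1.0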
>>
>> Fred
>>
>> On Wed, Aug 31, 2016 at 1:10 PM, AA <[email protected]> wrote:
>>
>>> According to the documentation, clone has a flag (share_inputs) which,
>>> if set to False, will clone the shared variables in the computational
>>> graph. But what exactly does that mean?
>>>
>>> If I understand correctly, the function returns a symbolic expression
>>> which is identical to the one given as a parameter, except some nodes in
>>> the graph have been replaced.
>>> First of all, how does one reference the other variables that are cloned
>>> if share_inputs is set to False? They don't seem to be part of what
>>> theano.clone returns.
>>>
>>> Second of all, what exactly does it mean to clone a shared variable
>>> when the underlying storage is the same? How does that affect training?
>>>
>>> Thank you :)
>>>
>>>
>>
