Hi All,

In one of my neural net models, I have shared weights at several different 
parts of the network. For example, let's say weights are completely shared 
at layer 1 and layer 3.
Will theano.grad sum the gradient contributions from both uses of the shared 
weights? Or will it simply take the gradient from layer 1?
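Concretely, here is a minimal sketch of the setup I mean (the shapes, the 
weight-free layer 2, and the toy cost are just placeholders):

import numpy as np
import theano
import theano.tensor as T

# One shared weight matrix, reused at layer 1 and layer 3
W = theano.shared(np.random.randn(4, 4).astype(theano.config.floatX),
                  name='W')
x = T.matrix('x')

h1 = T.tanh(T.dot(x, W))    # layer 1 uses W
h2 = T.tanh(h1)             # layer 2 (no weights, just for illustration)
h3 = T.tanh(T.dot(h2, W))   # layer 3 reuses the same W
cost = T.sum(h3 ** 2)

# Is gW the sum of the contributions from both uses of W,
# or only the contribution from layer 1?
gW = theano.grad(cost, W)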

Any insight appreciated.

Thanks,
John
