That's right. Theano will take care of the gradients for you, i.e. it will 
correctly calculate d loss/d W even if W is used in multiple places (the 
contributions from each use are combined into a single gradient).

You may need to do self.W.dimshuffle(0, 1, 3, 2) rather than np.transpose; 
I can never remember which numpy functions can be applied directly to theano 
tensors.
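
In case it helps, here is a minimal sketch of the kind of setup I have in 
mind (the shapes, the toy loss, and the variable names are illustrative 
assumptions, not taken from your code):

    import numpy as np
    import theano
    import theano.tensor as T
    from theano.tensor.nnet import conv2d

    # Illustrative shapes (an assumption): one input channel, one output
    # map, and a 1x5 horizontal filter.
    filter_shape = (1, 1, 1, 5)
    rng = np.random.RandomState(0)
    W = theano.shared(
        np.asarray(rng.uniform(-0.1, 0.1, size=filter_shape),
                   dtype=theano.config.floatX),
        name='W',
    )

    x = T.tensor4('x')

    # First pass: convolve along the x direction with the shared W.
    conv_outx = conv2d(input=x, filters=W, filter_shape=filter_shape)

    # Second pass: reuse the *same* shared W, swapping the last two axes
    # with dimshuffle so the 1d filter now runs along the y direction.
    # The filter_shape hint is swapped to match.
    conv_outxy = conv2d(
        input=conv_outx,
        filters=W.dimshuffle(0, 1, 3, 2),
        filter_shape=(filter_shape[0], filter_shape[1],
                      filter_shape[3], filter_shape[2]),
    )

    # A toy loss, just to show that T.grad combines the contributions
    # from both uses of W into a single gradient.
    loss = T.mean(conv_outxy ** 2)
    gW = T.grad(loss, W)

    f = theano.function([x], [conv_outxy, gW])
    out, grad = f(np.random.randn(1, 1, 10, 10).astype(theano.config.floatX))

At training time you would include W once in your parameter list, and any 
update computed from that gradient reflects both convolutions.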

On Tuesday, July 12, 2016 at 10:35:44 AM UTC-7, André Ribeiro wrote:
>
> Hi Jesse,
>
> Thank you for the reply. I guess what you are saying is that, when I am 
> defining the convolutional layer, the two consecutive filters should use 
> the same weights (as in the code below), right? Will theano be able to 
> properly update this (I mean, it would need to average the gradients 
> calculated at the two layers)?
>
>
>         # convolve 2d with separable 1d filters
>         conv_outx = conv2d(
>             input=input,
>             filters=self.W,
>             filter_shape=filter_shape,
>             image_shape=image_shape
>         )
>         conv_outxy = conv2d(
>             input=conv_outx,
>             filters=np.transpose(self.W,(0,1,3,2)),
>             filter_shape=filter_shape,
>             image_shape=image_shape
>         )
>
>
>
> On Tuesday, July 12, 2016 at 5:56:53 PM UTC+1, Jesse Livezey wrote:
>>
>> This should be as simple as creating just one shared variable to train 
>> and then using it in different places in the network.
>>
>> On Tuesday, July 12, 2016 at 9:48:28 AM UTC-7, André Ribeiro wrote:
>>>
>>> Hi,
>>>
>>> I am trying to create a CNN that shares weights between 2 consecutive 
>>> layers. This is particularly interesting if we are looking into separable 
>>> filters.
>>> For example:
>>> If we want to approximate a 2d Gaussian filter using a CNN, we would 
>>> only need to learn a 1-layer 1d convolutional network (with k elements) 
>>> and apply it first in the x direction and then in the y direction, 
>>> instead of a 1-layer 2d convolutional network (with k^2 elements).
>>>
>>> Any help here?
>>>
>>
