Also, the error I described occurred when I passed in conv1_out, not 
conv1_out[1] (that was just one of the things I tried in order to fix the 
problem). Sorry for all the corrections to my original post. 

On Thursday, March 9, 2017 at 12:26:10 PM UTC-5, Sergey Bokhnyak wrote:
>
> NOTE: The BN function is equivalent to T.nnet.bn.batch_normalization
>
> On Thursday, March 9, 2017 at 11:59:08 AM UTC-5, Sergey Bokhnyak wrote:
>>
>> Hello, I am trying to apply batch normalization after a convolution of my 
>> input, and I am getting a dimension mismatch error. The failing check is 
>> input[1].shape[3] != input[2].shape[3].
>>
>> After the 64-filter convolution my input has shape (1, 64, 112, 112), and 
>> my gamma, beta, mean, and std-dev are all of shape (64,). I guess my 
>> question is: am I doing something wrong? I can work around the problem by 
>> doing input.dimshuffle(0, 2, 3, 1) so that the channel dimension ends up 
>> as shape[3], and then another dimshuffle(0, 3, 1, 2) to bring it back to 
>> normal for the next convolution, but that doesn't seem like the right 
>> solution and looks inefficient (definitely not what the creators of Theano 
>> had in mind). The documentation for the batch_normalization function says 
>> the input is 'activations', so maybe I'm supposed to pass only part of the 
>> input? If anyone can help, I'd appreciate it very much.
>>
>> from theano.tensor.nnet import conv2d
>> from theano.tensor.nnet.bn import batch_normalization as BN
>> from theano.tensor.signal import pool
>> import theano.tensor as T
>>
>> # 64-filter convolution; its output has shape (1, 64, 112, 112)
>> conv1_out = conv2d(input=X,
>>                    filters=conv1_W,
>>                    filter_shape=(64, 3, 7, 7),
>>                    subsample=(2, 2),
>>                    border_mode=(3, 3))
>> # gamma, beta, mean, and std are shared variables of shape (64,)
>> layer1_bn_out = T.nnet.relu(BN(inputs=conv1_out[1],
>>                                gamma=bn1_conv1_gamma,
>>                                beta=bn1_conv1_beta,
>>                                mean=bn1_conv1_mean,
>>                                std=bn1_conv1_std))
>> # downsample of size 3 with stride of 2
>> current_output = pool.pool_2d(input=layer1_bn_out,
>>                               ds=(3, 3),
>>                               st=(2, 2),
>>                               mode='max',
>>                               ignore_border=False)
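>>
>> For reference, the dimshuffle workaround I mentioned above looks roughly 
>> like this (just a sketch, using the same variables as above):
>>
>> # Move the channel axis to position 3 so it lines up with the (64,)
>> # parameters, normalize, then move it back for the next convolution.
>> shuffled = conv1_out.dimshuffle(0, 2, 3, 1)   # (1, 112, 112, 64)
>> bn_shuffled = BN(inputs=shuffled,
>>                  gamma=bn1_conv1_gamma,
>>                  beta=bn1_conv1_beta,
>>                  mean=bn1_conv1_mean,
>>                  std=bn1_conv1_std)
>> layer1_bn_out = T.nnet.relu(bn_shuffled.dimshuffle(0, 3, 1, 2))  # back to (1, 64, 112, 112)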
>>
>> On a side note, I noticed that Theano's batch_normalization takes the 
>> standard deviation, whereas a lot of other libraries use the variance. 
>> Does that mean that if I want to load weights trained with another 
>> library, all I need to do is take the square root of the variances before 
>> instantiating them as shared variables?
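>>
>> In other words, something like this (hypothetical names; I'm assuming the 
>> other library exports the per-channel variance as a NumPy array):
>>
>> import numpy as np
>> import theano
>>
>> # other_lib_var is the (64,) variance vector exported from the other library
>> bn1_conv1_std = theano.shared(
>>     np.sqrt(other_lib_var).astype(theano.config.floatX),
>>     name='bn1_conv1_std')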
>>
>>
>> Thanks.
>>
>
