yes, that was the question! thank you!

On Wednesday, November 9, 2016 at 15:26:32 (UTC+1), nouiz wrote:
>
> I'm not sure I understand the question correctly. But I think the answer 
> is yes.
>
> Hidden layers in an MLP are fully connected layers.
>
> Fred
>
> On Wed, Nov 9, 2016 at 7:26 AM, Beatriz G. <[email protected]> wrote:
>
>> Hi, 
>>
>> Could an independent hidden layer of an MLP be used as a fully connected 
>> layer?
>>
>> regards.
>>
>>
>> On Friday, March 13, 2015 at 18:25:38 (UTC+1), Pascal Lamblin 
>> wrote:
>>>
>>> Hi, 
>>>
>>> On Fri, Mar 13, 2015, Orry Messer wrote: 
>>> > I don't quite understand the structure of the network after the second 
>>> > convolution/pooling layer and just before the hidden layer. 
>>> > I think what is confusing me is the batch aspect of it. 
>>> > With a batch size of 500, the hidden layer will have 500 units. This 
>>> > much I'm ok with. 
>>>
>>> That's not right: it's only a coincidence that the batch size, 500, is 
>>> also the number of output units of that layer. Each of these units will 
>>> compute a different value for each of the examples in the batch, so the 
>>> output of that layer will be (batch_size, n_outputs), i.e. (500, 500). 
>>>
>>> > But what is the input to this hidden layer? In the comments it says 
>>> > that the hidden layer operates on matrices of size (batch_size, 
>>> > pixel_size), which in the case of this code is (500, 800). 
>>>
>>> That's correct. 
>>>
>>> > Does this then mean that each hidden unit has 500*800 inputs? 
>>>
>>> Not really. Each hidden unit has 800 scalar inputs. Each of these inputs 
>>> will take a different value for each example of the minibatch, and the 
>>> neuron will also output one value for each example. 
>>>
>>> > And since there are 500 hidden units, does this then mean that there 
>>> > are a total of 500*(500*800) connections and as many weights to tune 
>>> > in this layer alone? 
>>>
>>> No, the weights are the same for all the examples of the minibatch. 
>>>
>>> -- 
>>> Pascal 
>>>
>> -- 
>>
>> --- 
>> You received this message because you are subscribed to the Google Groups 
>> "theano-users" group.
>> To unsubscribe from this group and stop receiving emails from it, send an 
>> email to [email protected].
>> For more options, visit https://groups.google.com/d/optout.
>>
>
>
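
The shape bookkeeping discussed in the quoted thread can be sketched in plain 
NumPy (a minimal illustration, not Theano code; the sizes 500 and 800 are the 
ones from the thread, and the tanh activation is just an example choice):

```python
import numpy as np

# Sizes from the thread: a minibatch of 500 examples, each flattened to
# 800 input values, feeding a fully connected layer of 500 hidden units.
batch_size, n_in, n_out = 500, 800, 500

rng = np.random.default_rng(0)
x = rng.standard_normal((batch_size, n_in))    # input matrix, shape (500, 800)

# ONE weight matrix and bias vector, shared by every example in the batch.
# Each hidden unit corresponds to one column of W: 800 scalar inputs per unit.
W = 0.01 * rng.standard_normal((n_in, n_out))  # shape (800, 500)
b = np.zeros(n_out)

# Each unit computes a different value for each example, so the layer's
# output is (batch_size, n_out) = (500, 500).
h = np.tanh(x @ W + b)

print(h.shape)  # (500, 500)
# The number of trainable weights is n_in * n_out, independent of batch size:
print(W.size)   # 400000, not 500 * 500 * 800
```

The key point Pascal makes is visible in the code: the batch size only sets 
how many rows flow through the layer, while the parameter count is fixed by 
`n_in * n_out` alone.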
