This is my guess as to what the number of parameters actually is:
First layer: 128 * (5*5*36 + 19*19)
  (128 filters of size 5x5 over 36 input planes, position-dependent biases)
11 hidden layers: 11 * 128 * (3*3*128 + 19*19)
  (128 filters of size 3x3 over 128 input planes, position-dependent biases)
Final layer: 2 * (3*3*128 + 19*19)
  (2 filters of size 3x3 over 128 input planes, position-dependent biases)

Total number of parameters: 161,408 + 2,130,304 + 3,026 = 2,294,738
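
As a sanity check, here's that arithmetic in a few lines of Python. The
layer shapes are my reading of the paper, so treat them as assumptions:

# Assumes full connectivity between layers and one position-dependent
# 19x19 bias map per filter.
first  = 128 * (5*5*36 + 19*19)        # 5x5 filters over 36 input planes
hidden = 11 * 128 * (3*3*128 + 19*19)  # 11 layers of 3x3 filters
final  = 2 * (3*3*128 + 19*19)         # 2 output filters
print(first, hidden, final, first + hidden + final)
# -> 161408 2130304 3026 2294738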

Did I get that right?

I have the same question about the use of symmetry as Hugh.
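
For concreteness, here is how I picture the weight tying that symmetry
would allow for a single 3x3 filter. This construction is my guess at
what "adding symmetry" means, not something taken from the paper:

import numpy as np

def symmetric_3x3(center, edge, corner):
    # A 3x3 filter constrained to be invariant under the 8 board
    # symmetries (rotations and reflections): 3 free parameters
    # instead of 9, which is where the weight saving would come from.
    return np.array([[corner, edge,   corner],
                     [edge,   center, edge],
                     [corner, edge,   corner]])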

Álvaro.


On Thu, Dec 25, 2014 at 8:49 PM, Hugh Perkins <hughperk...@gmail.com> wrote:

> Hi Aja,
>
> Couple of questions:
>
> 1. connectivity, number of parameters
>
> Just to check: each filter connects to all the feature maps below it,
> is that right?  I tried to verify this by ball-park estimating the
> number of parameters in that case and comparing it to the paragraph in
> your section 4.  That seems to support the hypothesis, but my estimate
> somehow under-estimates the number of parameters, by about 20%:
>
> Estimated total number of parameters:
>   approx = 12 layers * 128 filters * 128 previous feature maps
>            * 3*3 filter size
>          = 1.8 million
>
> But you say 2.3 million.  The numbers are close, so it seems the
> feature maps are fully connected to the lower-level feature maps, but
> I'm not sure where the extra 500,000 parameters come from?
>
> 2. Symmetry
>
> Aja, you say in section 5.1 that adding symmetry does not change the
> accuracy, neither raising nor lowering it.  Since adding symmetry
> presumably reduces the number of weights, and therefore speeds up
> learning, why did you decide not to implement it?
>
> Hugh
_______________________________________________
Computer-go mailing list
Computer-go@computer-go.org
http://computer-go.org/mailman/listinfo/computer-go
