Some photos here.
http://itpro.nikkeibp.co.jp/atcl/column/15/061500148/051900060/
(in Japanese). According to a photo in the article, the
performance/watt is about 13 times better.
Hideki
Petr Baudis: <20160519105443.go22...@machine.or.cz>:
> Hi,
>
> it seems that Google in fact used TPUs for
The AlphaGo network is detailed in their paper. It has about 50 binary
inputs, one layer of 5x5 convolutional filters, and about 12 layers of 3x3
convolutional filters. Detlef’s net is specified in the prototxt file he
published here. It’s wider and deeper, but with fewer inputs.
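For a concrete sense of the scale of the architecture described above, here is a minimal sketch in plain Python that counts the weights in such a stack. The specific numbers (48 binary input planes, 192 filters per layer, one 5x5 layer followed by eleven 3x3 layers and a 1x1 output layer) are my assumptions based on the AlphaGo paper's policy network, not figures from this thread; biases are ignored for simplicity.

```python
# Rough weight count for an AlphaGo-style policy network.
# Assumed shape (from the paper, not this thread): 48 input planes,
# 192 filters per hidden layer, one 5x5 layer, eleven 3x3 layers,
# and a final 1x1 convolution producing one output plane.

def conv_params(in_ch, out_ch, k):
    """Number of weights in a k x k convolution (biases ignored)."""
    return in_ch * out_ch * k * k

def total_params(inputs=48, filters=192, hidden_3x3=11):
    total = conv_params(inputs, filters, 5)                  # layer 1: 5x5
    total += hidden_3x3 * conv_params(filters, filters, 3)   # layers 2-12: 3x3
    total += conv_params(filters, 1, 1)                      # output: 1x1
    return total

print(total_params())  # 3880128, i.e. roughly 3.9 million weights
```

Almost all of the weights sit in the 3x3 layers, which is why making the net "wider and deeper" while trimming inputs, as Detlef did, changes the cost profile much more than the input-plane count does.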
The
Hey there,
just like everyone else I’m currently looking into neural networks for my
go program. ;) Apart from the AlphaGo paper, where can I find information
about network architecture? There’s the network from April 2015 from Detlef