I have used TensorFlow to train a CNN that predicts the next move, with an
architecture similar to what others have used (one layer of 5x5 convolutions
followed by 10 more layers of 3x3 convolutions, with 192 hidden units per
layer and ReLU activation functions) but with much simpler inputs. I found
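To get a feel for the size of such a network, here is a rough parameter count for the trunk described above (one 5x5 conv layer followed by ten 3x3 conv layers, 192 filters each). The number of input feature planes is an assumption; the post only says the inputs are "much simpler" than usual, so 4 planes is a guess for illustration.

```python
# Parameter count for the described policy-network trunk.
# Assumption: 4 input feature planes (not stated in the post).

def conv_params(k, c_in, c_out):
    """Weights plus biases for one k x k convolution layer."""
    return k * k * c_in * c_out + c_out

input_planes = 4   # assumed, hypothetical
hidden = 192

total = conv_params(5, input_planes, hidden)      # first 5x5 layer
total += 10 * conv_params(3, hidden, hidden)      # ten 3x3 layers

print(total)  # → 3339072
```

So the convolutional trunk alone is on the order of 3.3 million parameters, almost all of it in the 3x3 layers; the choice of input planes barely matters for the total.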
Thanks for the very interesting replies, David and Remi.
No-one is using TensorFlow, then? Any reason not to? (I'm just curious
because there looks to be a good Udacity DNN course
(https://www.udacity.com/course/deep-learning--ud730), which I was
considering, but it is using TensorFlow.)
Remi
Which one is Remi's?
On Thu, Mar 24, 2016 at 1:09 AM, David Fotland
wrote:
> There was one program (Shrike) that had a DNN without search. It didn’t
> finish in the top 8. Zen and Crazystone have custom DNN implementations.
> Dark Forest uses Torch. The rest used
Hi,
This UEC Cup was really very exciting.
I had started to code my own home-made deep learning library in November, after
finishing my Japanese mahjong engine. I was working quietly on it when the
Alphago paper was published. Then I felt that I had to urgently get something
to work before
There was one program (Shrike) that had a DNN without search. It didn’t finish
in the top 8. Zen and Crazystone have custom DNN implementations. Dark Forest
uses Torch. The rest used Caffe.
Remi's implementation is unusual and interesting. I'll let him share it if he
wants to.
David