If the network is too selective, perhaps the cost function used to train it
doesn't penalize extreme predictions sufficiently? Perhaps it was trained with
quadratic cost when it should have been using cross-entropy cost?
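The intuition behind that guess can be sketched numerically. With a sigmoid output unit, the gradient of the quadratic cost contains a sigmoid'(z) factor that vanishes for extreme activations, so a confidently wrong prediction barely gets corrected; the cross-entropy gradient reduces to (a - y) and does not vanish. A minimal illustration (the function names here are just for the sketch, not from any Crazy Stone code):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def quadratic_grad(z, y):
    # d/dz of 0.5 * (sigmoid(z) - y)^2  =  (a - y) * a * (1 - a)
    a = sigmoid(z)
    return (a - y) * a * (1.0 - a)

def cross_entropy_grad(z, y):
    # d/dz of -(y*log(a) + (1-y)*log(1-a))  =  a - y
    return sigmoid(z) - y

# A confidently wrong prediction: target y = 1, but z is very negative.
z, y = -6.0, 1.0
print(quadratic_grad(z, y))      # tiny: the sigmoid'(z) factor has vanished
print(cross_entropy_grad(z, y))  # close to -1: still a strong learning signal
```

So under quadratic cost, extreme (over-confident) outputs receive almost no corrective gradient, which would be consistent with a network that has learned to be too sharp.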
On Mon, May 23, 2016 at 12:08 AM Álvaro Begué
wrote:
Disclaimer: I haven't actually implemented MCTS with NNs, but I have played
around with both techniques.
Would it make sense to artificially scale down the values before the
SoftMax is applied, so the probability distribution is not as concentrated,
and unlikely moves are not penalized as much?
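Scaling the values down before the SoftMax amounts to adding a temperature parameter: dividing the inputs by T > 1 flattens the resulting distribution so that unlikely moves keep more probability mass. A minimal sketch (the `temperature` parameter and the example scores are illustrative assumptions, not taken from any engine's code):

```python
import math

def softmax(values, temperature=1.0):
    """SoftMax with a temperature; T > 1 flattens the distribution,
    so low-scoring moves are not penalized as heavily."""
    scaled = [v / temperature for v in values]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

scores = [4.0, 2.0, 0.5]
print(softmax(scores))                   # concentrated on the top move
print(softmax(scores, temperature=4.0))  # flatter: weak moves keep more mass
```

The trade-off is that a flatter prior spreads playouts over more candidate moves, which costs depth on the principal variation but guards against the search pruning away a move the network scored too low.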
Hi,
Thanks for using Crazy Stone.
I tried some changes during the week, but nothing worked, so the version that
played the game was almost identical to the commercial version.
The search did not anticipate Black E8 after B3. It seems the NN makes the
search too selective. I will investigate more.
Hi,
> It's fun to hear the pro making comments as she goes. I had hoped for a
> better game, though.
> Any comments from the CS camp?
>
I'm not from the CrazyStone team, but I'm a happy user of CS Deep Learning.
I analyzed the game (30,000 playouts per move) with the commercially
available version and
I just saw the video here: https://www.youtube.com/watch?v=ZdrV2H5zIOM
It's fun to hear the pro making comments as she goes. I had hoped for a
better game, though.
Any comments from the CS camp?
Thanks,
Álvaro.
On Mon, May 16, 2016 at 3:58 AM, Xavier Combelle wrote:
Hello, I would like to invite all you go bot developers to my new go server.
http://goratingserver.appspot.com
Here are a few reasons why it's bot-friendly. It's a simple (and very pretty)
auto-match server, with no handicaps, and only such pairings that there is
a reasonable chance for both