Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread Brian Sheppard via Computer-go
I think it uses the champion network. That is, the training periodically generates a candidate, and there is a playoff against the current champion. If the candidate wins more than 55% of the games, then a new champion is declared. Keeping a champion is an important mechanism, I believe. That creates
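A minimal sketch of that gating step, assuming a hypothetical play_one_game() helper; the 400-game evaluation match and the 55% promotion threshold are the figures reported in the AlphaGo Zero paper:

    def maybe_promote(champion, candidate, num_games=400, threshold=0.55):
        """Play an evaluation match; promote the candidate only if it clearly wins."""
        wins = 0
        for _ in range(num_games):
            if play_one_game(candidate, champion):   # hypothetical helper: True if the candidate won
                wins += 1
        if wins / num_games > threshold:
            return candidate    # new champion: its self-play games feed further training
        return champion         # otherwise keep generating data with the old champion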

Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread uurtamo .
I ask because there are (nearly) bus-speed networks that could make multiple evaluations quick, especially if the various versions didn't differ by more than a fixed fraction of nodes. s. On Oct 25, 2017 3:03 PM, uurt...@gmail.com wrote: Does the self-play step use the most recent network for

Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread uurtamo .
Does the self-play step use the most recent network for each move? On Oct 25, 2017 2:23 PM, "Gian-Carlo Pascutto" wrote: > On 25-10-17 17:57, Xavier Combelle wrote: > > Is there some way to distribute learning of a neural network? > > Learning as in training the DCNN, not

Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread Xavier Combelle
Nice to know. I wrongly believed that training such a big neural network would need considerable hardware. On 25/10/2017 at 19:54, Álvaro Begué wrote: > There are ways to do it, but it might be messy. However, the vast > majority of the computational effort will be in playing games to > generate

Re: [Computer-go] AlphaGo Zero SGF - Free Use or Copyright?

2017-10-25 Thread Darren Cook
> What do you want to evaluate the software for? Corner cases which never > happen in a real game? If the purpose of this mailing list is a community working out how to make a 19x19 go program that can beat any human, then AlphaGo has finished the job, and we can shut it down. But this list

Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread Shawn Ligocki
My guess is that they want to distribute playing millions of self-play games. Then the learning would be comparatively much faster. Is that right? On Wed, Oct 25, 2017 at 11:57 AM, Xavier Combelle wrote: > Is there some way to distribute learning of a neural network

Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread Álvaro Begué
There are ways to do it, but it might be messy. However, the vast majority of the computational effort will be in playing games to generate a training database, and that part is trivial to distribute. Testing if the new version is better than the old version is also very easy to distribute.
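As a concrete picture of that split, here is a minimal sketch of a crowd-sourced self-play worker: only finished games (not gradients) cross the network. The server URL, endpoints, and the self_play_one_game() engine call are hypothetical illustrations, not any project's actual API.

    import urllib.request

    SERVER = "http://example.org"               # hypothetical coordination server

    def fetch_best_weights():
        # Every client generates games with the same current-best network.
        with urllib.request.urlopen(SERVER + "/best-network") as resp:
            return resp.read()

    def upload_game(sgf_text):
        req = urllib.request.Request(SERVER + "/submit-game",
                                     data=sgf_text.encode(), method="POST")
        urllib.request.urlopen(req)

    def worker_loop():
        while True:
            weights = fetch_best_weights()
            sgf = self_play_one_game(weights)   # hypothetical engine call
            upload_game(sgf)

The central server then only has to train on the pooled games and run the (equally distributable) candidate-vs-champion matches.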

Re: [Computer-go] Zero is weaker than Master!?

2017-10-25 Thread Xavier Combelle
As I understand the paper, they directly created AlphaGo Zero with a 40-block setup. They just made a reduced 20-block setup to compare on kifu prediction (as far as I searched in the paper, it is the only place where they mention the 20-block setup). They specifically mention comparing several
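For readers unfamiliar with the terminology, "block" here refers to the depth of the residual tower. A rough PyTorch sketch of what a 20- versus 40-block tower means, omitting the input convolution and the policy/value heads the paper also describes (256 filters per block is the paper's figure):

    import torch.nn as nn

    class ResBlock(nn.Module):
        def __init__(self, channels=256):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bn1 = nn.BatchNorm2d(channels)
            self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
            self.bn2 = nn.BatchNorm2d(channels)
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x):
            y = self.relu(self.bn1(self.conv1(x)))
            y = self.bn2(self.conv2(y))
            return self.relu(x + y)              # skip connection

    def make_tower(num_blocks):                  # 20 or 40: the two setups being discussed
        return nn.Sequential(*[ResBlock() for _ in range(num_blocks)])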

Re: [Computer-go] AlphaGo Zero

2017-10-25 Thread Gian-Carlo Pascutto
On 25-10-17 16:00, Petr Baudis wrote: >> The original paper has the value they used. But this likely needs tuning. I >> would tune with a supervised network to get started, but you need games for >> that. Does it even matter much early on? The network is random :) > > The network actually

Re: [Computer-go] AlphaGo Zero SGF - Free Use or Copyright?

2017-10-25 Thread Xavier Combelle
On 24/10/2017 at 22:41, Robert Jasiek wrote: > On 24.10.2017 20:19, Xavier Combelle wrote: >> totally unrelated > > No, because a) software must also be evaluated and can be by go theory and What do you want to evaluate the software for? Corner cases which never happen in a real game? The

Re: [Computer-go] Zero is weaker than Master!?

2017-10-25 Thread Xavier Combelle
Now I understand better. On 25/10/2017 at 04:28, Hideki Kato wrote: > Are you thinking the 1st instance could reach Master level > if given more training days? > > I don't think so. The performance would stop > improving at 3 days. If not, why would they have built the 2nd > instance? > > Best, >

Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread Xavier Combelle
Is there some way to distribute learning of a neural network? On 25/10/2017 at 05:43, Andy wrote: > Gian-Carlo, I didn't realize at first that you were planning to create > a crowd-sourced project. I hope this project can get off the ground > and running! > > I'll look into installing this

Re: [Computer-go] AlphaGo Zero

2017-10-25 Thread Petr Baudis
On Fri, Oct 20, 2017 at 08:02:02PM, Gian-Carlo Pascutto wrote: > On Fri, Oct 20, 2017, 21:48 Petr Baudis wrote: > > > Few open questions I currently have, comments welcome: > > > > - there is no input representing the number of captures; is this > > information
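For context on what the Zero inputs do contain: the paper's network sees 17 binary 19x19 planes (eight history planes per colour plus one colour-to-move plane), and indeed no plane carrying capture counts. A rough sketch of that encoding; the encode_position() helper and its argument layout are illustrative, not the paper's code:

    import numpy as np

    def encode_position(own_history, opp_history, black_to_move, size=19):
        # own_history / opp_history: 8 boolean (size x size) arrays each,
        # most recent first, marking the current player's and the opponent's stones.
        planes = np.zeros((17, size, size), dtype=np.float32)
        for t in range(8):
            planes[t] = own_history[t]
            planes[8 + t] = opp_history[t]
        planes[16] = 1.0 if black_to_move else 0.0   # constant colour-to-play plane
        return planes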

[Computer-go] Collision between e-manners and shoelaces...

2017-10-25 Thread patrick.bardou via Computer-go
Hi Pierce from Caltech, Would an Asperger's typically try to lift himself up by pulling on his shoelaces? I think you just mistook me for my fellow countryman Xavier Combelle and were not replying to my post: http://computer-go.org/pipermail/computer-go/2017-October/010338.html I posted it

Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread Gian-Carlo Pascutto
On 25-10-17 05:43, Andy wrote: > Gian-Carlo, I didn't realize at first that you were planning to create a > crowd-sourced project. I hope this project can get off the ground and > running! > > I'll look into installing this but I always find it hard to get all the > tool chain stuff going. I

Re: [Computer-go] Source code (Was: Reducing network size? (Was: AlphaGo Zero))

2017-10-25 Thread fotland
Sadly, this is GPL v3, so it's not safe for me to look at it. David PS even though Robert's posts are slightly off topic for the AlphaGo discussion, I respect that he has thought far more deeply than I have about go, and I support his inclusion in the list. -Original Message- From: