I think you can only evaluate static evaluation in the context of a search
and a tournament between programs. You could start with a simple 1-ply
search and play against gnugo. Strength in life and death or predicting pro
moves doesn't correlate with the ability to win games.
David
Many Faces does almost the same thing (handicap games with black only, 7
points per handicap stone, decreasing linearly with move number to move 90).
It looks like this change gained about half a rank on KGS.
David
-Original Message-
From: computer-go-boun...@computer-go.org
Many Faces has the same issue. The pruning and tuning that is required for
19x19 doesn't help 9x9. It seems that now the programs are strong enough
that 9x9 requires a good opening book, and I'd rather spend my time making
19x19 stronger.
David
Zen's algorithm is getting heavier and
My old MPI code had a scaling bug. Performance scaling (playouts per
second) was linear, but the strength did not scale well, and 64 cores was
weaker than 32 cores. I have a 16 core cluster of my own now (four 2.3 GHz
Q8200 quad core), and I discovered that the MPI code hangs when using MPICH2
To: computer-go
Subject: Re: [computer-go] Strong programs on cgos 19x19?
or the strong version of pachi.
Done.
Jean-loup
2010/2/16 David Fotland fotl...@smart-games.com
Is this 23 cores SMP working on the same tree, or four by 6-cores? I'm
running a cluster of four 4-core 2.3 GHz machines, using MPI to share the
core of the trees a few times a second.
The results between zen-1c, mfgo-16c and pachi-23c are interesting.
Zen wins about 60% against many
This would be very similar to the integration I do in Many Faces of Go. The
old engine provides a bias to move selection in the tree, but the old engine is
single threaded and only does a few hundred evaluations per second. I
typically get between 40 and 200 playouts through a node before Old
You can do some GPU experiments on Amazon AWS before you buy. 65 cents per hour
David
http://aws.amazon.com/ec2/instance-types/
G2
This family includes G2 instances intended for graphics and general purpose GPU
compute applications.
Features:
High Frequency Intel Xeon E5-2670 (Sandy Bridge)
For Many Faces: moves like any atari, filling a liberty in a losing semeai, or
attacking a group that is alive but doesn’t have two clear eyes yet.
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
Stefan Kaitschick
Sent: Saturday, January 10, 2015 1:13 AM
To:
Why don’t you make a dataset of the raw board positions, along with code to
convert to Clark and Storkey planes? The data will be smaller, people can
verify against Clark and Storkey, and they have the data to make their own
choices about preprocessing for network inputs.
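The conversion code could be as small as a sketch like this (a hypothetical minimal encoder; the real Clark and Storkey feature set has more planes than the three shown here):

```python
import numpy as np

def board_to_planes(board, to_move):
    """Convert a raw 19x19 board (0=empty, 1=black, 2=white) into
    three binary feature planes: stones of the side to move, stones
    of the opponent, and empty points.  Illustrative only -- the
    actual Clark & Storkey encoding is richer."""
    board = np.asarray(board)
    opponent = 3 - to_move
    planes = np.stack([
        (board == to_move).astype(np.uint8),
        (board == opponent).astype(np.uint8),
        (board == 0).astype(np.uint8),
    ])
    return planes  # shape (3, 19, 19)
```

Shipping the raw 0/1/2 boards plus a few lines like these lets everyone regenerate whatever planes they prefer.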
David
Won’t hosting limit your usability? With cgos I can build and immediately test
on cgos on my development machine. With your service, how do I get my new
executable to run? If my engine uses a GPU or is a multinode cluster, how does
that run on your docker service?
David
From:
Converting back and forth from eval to winning probability is interesting, as
is combining the quick win threat and long term advantage evals.
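A minimal sketch of one such conversion (the logistic form and the 10-point scale are my assumptions for illustration, not anyone's actual formula):

```python
import math

def eval_to_winprob(score_margin, scale=10.0):
    """Map a territory-style evaluation (points of margin for the side
    to move) onto a winning probability with a logistic curve.
    'scale' is a hypothetical tuning constant."""
    return 1.0 / (1.0 + math.exp(-score_margin / scale))

def winprob_to_eval(p, scale=10.0):
    """Inverse mapping: winning probability back to a point margin."""
    return scale * math.log(p / (1.0 - p))
```

The two directions are exact inverses, so an engine could move freely between a point-count view and a win-rate view of the same position.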
David
-Original Message-
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On
Behalf Of Darren Cook
Sent: Wednesday, April 22,
I didn’t notice a difference. Like everyone else, once I had RAVE implemented
and added biases to the tree move selection, I found the UCT term made the
program weaker, so I removed it.
David
-Original Message-
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On
I can't travel to Europe for this tournament. The main issue for me is
arranging a local operator. I have no way to do that.
Regards,
David
-Original Message-
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On
Behalf Of Petr Baudis
Sent: Sunday, July 05, 2015 11:37
Congratulations to Aya. The commentary on the ManyFaces vs Aya game is very
interesting.
David
-Original Message-
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On
Behalf Of Petr Baudis
Sent: Wednesday, July 29, 2015 2:21 PM
To: computer-go@computer-go.org
In general this is beyond the state of the art of the strongest go programs.
You can’t score without determining the status of every group (live, dead,
seki), and you may need to identify required interior defensive moves that have
not been played.
David
From: Computer-go
Many Faces only has big nodes with all of the child statistics in one node,
along with the totals for the position. Like the right hand of your diagram,
but also with the 11/22 totals. There is no tree. All nodes are in a big
transposition table and there are no child or parent pointers. I
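The idea can be sketched like this (hypothetical names; not Many Faces' actual layout):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One 'big node': totals for the position plus per-child
    statistics stored together.  No parent or child pointers --
    a child is found by hashing the successor position into the
    same table."""
    wins: int = 0
    visits: int = 0
    child_stats: dict = field(default_factory=dict)  # move -> [wins, visits]

table = {}  # Zobrist hash -> Node

def lookup(zobrist_hash):
    # create-on-miss, as in a plain transposition table
    return table.setdefault(zobrist_hash, Node())
```

Because there are no pointers, transpositions are shared for free and nodes can be discarded or overwritten without fixing up a tree.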
Many Faces uses 2200 for RAVE_EQUIV. I found that anything between 2000 and
3000 was about the same, and CLOP recommended 2200. 1000 was a little worse,
and 500 was much worse. In discussions with other programmers I heard numbers
between and 5000.
For parameter tuning I recommend
Yu Bin won his game against Dolbaram. The second official pro game is
happening now. 51wq.lianzhong.com/yidongwq
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
fotl...@smart-games.com
Sent: Saturday, November 14, 2015 1:35 AM
To: computer-go@computer-go.org
Many Faces of Go doesn’t use Remi’s playout policy and I don’t think Zen does
either. I don’t think Remi’s and Mogo’s are similar either, since they were in
some ways competing developments. The bias issue is very real, so as you add
knowledge to the playouts you have to be careful to add
Many Faces of Go has 2052 3x3 patterns. All have an empty point in the center.
One value is used for all the illegal patterns, so there are 2051 valid
patterns. I use Aja’s idea of including in the pattern the Atari status of
zero to four adjacent groups. That’s why it’s more than Álvaro’s
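A sketch of how such a pattern index might be packed (the neighbor ordering and packing here are hypothetical, and the symmetry canonicalization that reduces the raw count toward 2052 is omitted for brevity):

```python
def pattern_index(neighbors, atari_flags):
    """Index a 3x3 pattern around an empty point.  'neighbors' is the 8
    surrounding points (0=empty, 1=black, 2=white, 3=edge) read in a
    fixed order; 'atari_flags' marks which of the up-to-four adjacent
    groups are in atari, Aja's extension mentioned in the text."""
    idx = 0
    for v in neighbors:       # base-4 digit per neighbor
        idx = idx * 4 + v
    for f in atari_flags:     # 4 extra atari bits
        idx = idx * 2 + (1 if f else 0)
    return idx
```

The atari bits are exactly why the count exceeds the plain 3x3 color patterns: the same stone arrangement gets distinct entries depending on which adjacent groups are in atari.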
I've been helping them test their client and there are some issues. They
are working on fixing them. If you are having problems while testing,
please email me directly at fotl...@smart-games.com for details, and I can
save you some time or provide workarounds. I don't have email addresses for
-Original Message-
> From: Computer-go [ <mailto:computer-go-boun...@computer-go.org>
> mailto:computer-go-boun...@computer-go.org] On Behalf
> Of David Fotland
> Sent: 15 October 2015 06:51
> To: <mailto:computer-go@computer-go.org> computer-go@computer
In 2008 Many Faces was getting about 25k light playouts per second on 19x19.
Today it gets 2500 playouts per second on one thread of an i7-3770. I don’t
use a probability distribution in the UCT tree. I both count liberties and
maintain lists of liberty points, but all incrementally. In the
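Keeping both a liberty count and the liberty points themselves incrementally might look like this toy sketch (not Many Faces' code; a set gives the count for free and supports single-point updates as stones are played):

```python
class GroupLiberties:
    """Incremental liberty bookkeeping for one group."""
    def __init__(self):
        self.libs = set()

    def add(self, point):
        self.libs.add(point)

    def remove(self, point):
        self.libs.discard(point)

    def merge(self, other, stone_points):
        # when two groups connect: union the liberty sets and drop
        # any points now occupied by the connecting stones
        self.libs |= other.libs
        self.libs -= set(stone_points)

    def count(self):
        return len(self.libs)
```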
There is an easy way to enforce computational limits. Ask everyone to run on
an identical AWS instance. Nevertheless, I’m against identical hardware
tournaments except as a special rare exception.
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
David Doshay
You could do what they do in bridge tournaments, and provide two sets of results
from the same tournament.
Hardware would be unrestricted for everyone
The Open result would include all participants, exactly as today.
A "single machine" result would only include participants that ran on a single
The non-MCTS levels of Many Faces try to maximize score, with some bias toward
safety when ahead. The non-MCTS version uses dynamic komi to avoid giving up
points in the endgame, but this is not in the version 12 released engine.
David
From: Computer-go
Many Faces of Go is MC + expert system (plus local search, etc). The reason I
won the world championship in 2008 is because I implemented MCTS but
incorporated the old Many Faces expert system move generator and ranking. This
is pretty slow (a few hundred positions a second), so when the tree
No. Since MF's search is so highly pruned, and directed by the expert system
move generator, it scales poorly with computer power. If I went back to the
pure MFGO engine and added the modern ELO based pattern from Remi's approach, I
think it would be a couple of stones stronger, but still
ber 04, 2015 12:34 AM
> To: computer-go@computer-go.org
> Subject: Re: [Computer-go] re comments on Life and Death
>
> On 04.09.2015 07:25, David Fotland wrote:
> > group strength and connection information
>
> For this to work, group strength and connection status must
I forgot, I did publish a paper on Many Faces:
https://www.researchgate.net/publication/220174515_Static_Eye_Analysis_in_The_Many_Faces_of_Go
I'm not sure it's available online.
David
> -Original Message-
> From: Computer-go [mailto:computer-go-boun...@computer-go.org] On
> Behalf Of
Probably everyone does something different for dynamic komi. There have been a
few publications.
In MFGO, the shipping version 12 doesn’t use dynamic komi, but the KGS version
has it, and it's probably worth about half a stone. My algorithm tries to keep
the win rate between 55% and 60% when
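The band-keeping idea can be sketched like this (the 0.5-point step is a made-up value for illustration, and the real algorithm surely smooths the adjustment):

```python
def adjust_komi(komi, win_rate, lo=0.55, hi=0.60, step=0.5):
    """Dynamic komi in the spirit described: when the engine's win rate
    drifts above the target band, raise the internal komi to make its
    own task harder; when it falls below, ease the komi.  Keeping the
    win rate near 55-60% keeps the playouts discriminating."""
    if win_rate > hi:
        return komi + step
    if win_rate < lo:
        return komi - step
    return komi
```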
Many Faces of Go gives reasons for its moves after the fact. It reasons about the
position using go proverbs, life and death analysis, group strength and
connection information, etc. If you have a copy, you can ask it to explain its
reasons for making a move. There were far more than a few
I agree that group strength can't be a single number. That's why I classify
groups instead. Each classification is treated differently when estimating
territory, when generating candidate moves, etc. The territory counts depend
on the strength of the nearby groups.
Monte Carlo has a big
I never tried to optimize stopping, so my stopping rule is very conservative.
Many Faces stops at twice the number of points on the board, or if the mercy
rule triggers. The mercy rule requires one side to have many more stones on
the board than the other (at least 1/3 of the number of points
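Those two cutoffs can be sketched as follows (the numbers come from the text; the function name and exact comparison details are mine):

```python
def should_stop_playout(move_count, black_stones, white_stones,
                        board_points=361):
    """Conservative playout termination: stop after twice the number of
    points on the board, or when the mercy rule fires because one side
    has at least board_points/3 more stones on the board."""
    if move_count >= 2 * board_points:
        return True
    if abs(black_stones - white_stones) >= board_points // 3:
        return True  # mercy rule: the playout is hopelessly one-sided
    return False
```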
Yes, in the old engine, I roll everything up into a single number, with a
resolution of 1/100th of a point (only so the total score would fit in a 16 bit
integer on the 16 bit machine I used for development in 1982).
I would say rather, that expert systems are dead in Go because many smart
No, simple radiation is not the best, although some programs (including mine)
started with something like this. I think the best approach was Reiss' Go4++,
where territory was modelled using connectivity. If a new stone can be
connected to a living group of the same color, then this point
I don’t share or take code from other programs because Many Faces of Go is
commercial. Many other programs have licenses that are not compatible with
commercial use, so I'm careful not to even look at their source code. We share
ideas all the time, through publications, informal conversations
e playing
> > strength of the old program and get a fair comparison it should again
> > run on an old machine while the modern go-programs use today's hardware.
> > - Michael.
>
> I discussed your point in depth with David Fotland (father of MFoG).
> Back in 1998, MFoG had
Attempting to maximize the score is not compatible with being a strong engine.
If you want a dan-level engine, it must maximize win probability.
David
> -Original Message-
> From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
> Darren Cook
> Sent: Tuesday,
1 kyu on KGS with no search is pretty impressive. Perhaps Darkforest2 is too
slow.
David
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of Andy
Sent: Monday, November 23, 2015 9:48 AM
To: computer-go
Subject: Re: [Computer-go] Facebook Go AI
As of about an
> Hi David,
>
> I am not happy with my IDE on linux too. You might give Visual Studio on
> linux a try:
>
> https://www.visualstudio.com/de-de/products/code-vs.aspx
>
> It seems to be free...
>
> Detlef
>
> Am 05.02.2016 um 07:13 schrieb David Fotland:
> &
I'm not using it. Many Faces is written in C (GUI in C++ with MFC). I ported
Caffe to Windows and I'm calling caffelib directly from mfgo. I'm not training
a net yet, so I haven’t decided what to do. Most likely I will create the
input database using C++ code in Many Faces, and train using
access to
halting your machine if you are deep in the guts. ;)
s.
On Tue, Feb 2, 2016 at 10:25 AM, David Fotland <fotl...@smart-games.com> wrote:
Detlef, Hiroshi, Hideki, and others,
I have caffelib integrated with Many Faces so I can evaluate a DNN. Thank you
very much Detle
Google’s breakthrough is just as impactful as the invention of MCTS.
Congratulations to the team. It’s a huge leap for computer go, but more
importantly it shows that DNN can be applied to many other difficult problems.
I just added an answer. I don’t think anyone will try to exactly
Robert, please consider some of this as the difference between math and
engineering. Math desires rigor. Engineering desires working solutions. When
an engineering solution is being described, you shouldn't expect the same level
of rigor as in a mathematical proof. Often all we can say is
Amazon uses deep neural nets in many, many areas. There is some overlap with
the kind of nets used in AlphaGo. I passed a link to the paper on to one of
our researchers and he found it very interesting. DNN works very well when
there is a lot of labelled data to learn from. It can be useful
Detlef, Hiroshi, Hideki, and others,
I have caffelib integrated with Many Faces so I can evaluate a DNN. Thank you
very much Detlef for sample code to set up the input layer. Building Caffe on
Windows is painful. If anyone else is doing it and gets stuck I might be able
to help.
What
Ubuntu update updated the graphics driver: I had two
> times in the last year to reinstall CUDA (a little ugly, as the graphics
> driver did not work after the update and you had to boot into command
> line mode).
>
> Detlef
>
> Am 02.02.2016 um 19:25 schrieb David Fotland:
> > Detlef
] Move evalution by expected
> value, as product of expected winrate and expected points?
>
> My 1.5 cent:
>
> David Fotland has a nice score-estimator in his (old) ManyFaces bot.
> The score estimator is still from the days before the Monte Carlo
> version.
>
> Perhaps
> Also, even quite big nets probably can be run on modest GPUs reasonably
> well (within memory bounds). It's the training where the size really
> hurts.
>
> On Tue, Mar 1, 2016 at 6:19 PM, Petr Baudis <pa...@ucw.cz> wrote:
> > On Tue, Mar 01, 2016 at 09:14:39AM -0800, Da
Yes, I think the programs will have similar biases. In this game Sedol had
some groups that were alive, but needed correct responses to stay alive. Even
though the pro's stones won’t die, the playouts sometimes manage to kill them.
This makes the program think it is more ahead than it
I predicted Sedol would be shocked. I'm still rooting for Sedol. From
Scientific American interview...
Schaeffer and Fotland still predict Sedol will win the match. “I think the pro
will win,” Fotland says, “But I think the pro will be shocked at how strong the
program is.”
>
> P.
Smart-games.com is getting a big increase in traffic, so there is certainly
more interest in the game now. I hope it holds up for the long term.
David
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
Dmitry Kamenetsky
Sent: Saturday, March 12, 2016 2:18 PM
To:
Tremendous games by AlphaGo. Congratulations!
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
Lukas van de Wiel
Sent: Saturday, March 12, 2016 12:14 AM
To: computer-go@computer-go.org
Subject: [Computer-go] Congratulations to AlphaGo
Whoa, what a fight! Well
He was already in Byo-yomi, so perhaps he didn’t have an accurate count. This
might explain why he looked upset at move 175. He might have realized his
mistake.
David
> -Original Message-
> From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf
> Of Darren Cook
>
There are 12 programs here that have deep neural nets. Two did not qualify
for the second day, and six of them made the final 8. Many Faces has very
basic DNN support, but it’s turned off because it isn’t making the program
stronger yet. Only Dolburam and Many Faces don’t have DNN in the
I have sgf’s of the Many Faces’ games, but I finished 8th. I don’t have the
top games.
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
Pawel Morawiecki
Sent: Sunday, March 20, 2016 1:57 AM
To: computer-go@computer-go.org
Subject: Re: [Computer-go] UEC cup 2nd day
> -Original Message-
> From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
> Darren Cook
> Sent: Wednesday, March 23, 2016 5:19 AM
> To: computer-go@computer-go.org
> Subject: *SPAM* Re: [Computer-go] UEC cup 2nd day
>
> David Fotland
> * Desktop: CPU i7-4770 (Haswell), 3.5 GHz, DRAM - 16 GB; GPU K20.
> >>> * Ubuntu 12.04; gcc 4.7.3; MKL 11.1.
> >>>
> >>> Test:: imagenet, 100 train iteration (batch = 256).
> >>>
> >>> * GPU: time= 260 sec / memory = 0.8 GB
I got the basics of Machine learning (including sample neural nets) from Andrew
Ng's course, two or three years ago. I highly recommend it. Lots of
practical advice. The rest came from reading papers and probably some on-line
searches. Amazon's Computer Vision team uses deep neural
Very interesting, but it should also mention Aya.
I'm working on this as well, but I haven’t bought any hardware yet. My goal is
not to get 7 dan on expensive hardware, but to get as much strength as I can on
standard PC hardware. I'll be looking at much smaller nets, that don’t need a
GPU
Many Faces thought AlphaGo was ahead most of the game. It looked to me like
the turning point was when AlphaGo cut in the center then gave up the two
cutting stones for gains on both sides (but not so strong…).
Congratulations Aja!
I watched it at Google in Mountain View with about
a factor of 32 compression right there, and you might be using
constant planes for some inputs, and if the output is a move it fits in 9
bits...
Álvaro.
On Wed, Apr 27, 2016 at 12:55 AM, David Fotland <fotl...@smart-games.com> wrote:
I have my deep neural net training setup w
training
You can also use HDF5 format, which has transparent compression as well as
Caffe support.
Josef
Dne st 27. 4. 2016 18:06 uživatel Gian-Carlo Pascutto <g...@sjeng.org> napsal:
On 27-04-16 17:45, David Fotland wrote:
> I’d rather just buy another drive than spend ti
I have my deep neural net training setup working, and it's working so well I
want to share. I already had Caffe running on my desktop machine (4 core
i7) without a GPU, with inputs similar to AlphaGo generated by Many Faces
into an LMDB database. I trained a few small nets for a day each to get
The AlphaGo network is detailed in their paper. They have about 50 binary
inputs, one layer of 5x5 convolutional filters, and about 12 layers of 3x3
convolutional filters. Detlef’s net is specified in the prototxt file he
published here. It’s wider and deeper, but with fewer inputs.
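For a rough sense of scale, here is a parameter count for a tower shaped as described: ~50 input planes, one 5x5 layer, then about twelve 3x3 layers. The 192-filter width is my assumption for illustration (the paper tried several widths):

```python
def conv_params(in_ch, out_ch, k):
    """Weights plus biases for one k x k convolutional layer."""
    return in_ch * out_ch * k * k + out_ch

def policy_net_params(inputs=50, filters=192, layers_3x3=12):
    """Rough parameter count for an AlphaGo-style policy tower:
    one 5x5 conv from the input planes, then a stack of 3x3 convs."""
    total = conv_params(inputs, filters, 5)
    for _ in range(layers_3x3):
        total += conv_params(filters, filters, 3)
    return total
```

At these settings the count lands around four million parameters, small by image-recognition standards, which is part of why such nets are practical to evaluate inside a search.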
The
I don’t expect AlphaGo will be available at any price, but I expect similar
strength programs will be running on high end PCs in a few years. The AlphaGo
team has done an outstanding job of exploring the solution space and showing us
the way. Others can now tweak and optimize and find more
https://www.reddit.com/r/pokemongo/comments/4tez82/how_pokemon_really_play_go/
Although it looks like they are actually playing Go Moku.
___
Computer-go mailing list
Computer-go@computer-go.org
http://computer-go.org/mailman/listinfo/computer-go
Correction on ManyFaces hardware. Running on a 4-core i7-4790 3.6 GHz, without
a GPU, using a deep neural net (that I trained on KGS games).
David
From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
Nick Wedd
Sent: Saturday, July 16, 2016 8:21 AM
To:
I train using approximately the same training set as AlphaGo, but so far
without the augmentation with rotations and reflections. My target is about
55.5%, since that's what AlphaGo got on their training set without
reinforcement learning.
I find I need 5x5 in the first layer, at least 12
Many Faces does most of what has been mentioned. In addition, rather than stop
search when it is impossible for another move to be chosen, I stop earlier,
when it is unlikely for another move to become best. When far ahead, I stop a
little earlier. That preserves some time in case there is a
Congratulations to Zen for playing so well against a strong pro. It won't be
long until anyone can get a pro strength go program that runs on their ordinary
PC.
David
> -Original Message-
> From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
> Hiroshi Yamashita
Remi has something: https://www.remi-coulom.fr/kifu-snap/
> -Original Message-
> From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
> Hideki Kato
> Sent: Thursday, November 24, 2016 4:17 PM
> To: computer-go@computer-go.org
> Subject: [Computer-go] Auto Go game
Amazon p2.16xlarge instance gets you 64 cores (Xeon E5-2686v4) and 16 K80 GPUs
for $14.40 per hour. Not bad if you just want to run it during a competition.
David
> -Original Message-
> From: Computer-go [mailto:computer-go-boun...@computer-go.org] On Behalf Of
> Cameron Browne
> Sent:
Because you test it both ways, and one wins more games. Many things about the
playout policy are mysterious and can only be tested to see if they make play
stronger. Often the results of testing are counterintuitive. I'd guess only
about a quarter of the things I tried in Many Faces made the
AlphaGo's publication is pretty clear on how they did it. Now that their research
has shown the way, other competent teams with similar compute resources should
be able to duplicate their work. It has been almost a year, which is enough
time.
David
> -Original Message-
> From:
I think the character set property just refers to the contents of comments and
similar fields. The SGF format itself uses only characters common to UTF-8 and
US-ASCII, so there is no need to assume a character set before reaching the
property. If you find the character set property in the root
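A sketch of that approach: scan the raw bytes as ASCII for the CA property and use the declared charset only for decoding text fields (naive on purpose; a real parser would restrict the search to the root node so a CA[...] inside a comment can't match):

```python
import re

def sgf_charset(sgf_bytes):
    """Find the CA (character set) property in raw SGF bytes.  Since
    SGF's own syntax is plain ASCII, the bytes can be scanned directly;
    the result defaults to latin-1, the SGF FF[4] default charset."""
    m = re.search(rb'CA\[([^\]]*)\]', sgf_bytes)
    return m.group(1).decode('ascii') if m else 'latin-1'
```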
o-boun...@computer-go.org] On Behalf Of
> Ray Tayek
> Sent: Friday, May 05, 2017 10:44 PM
> To: computer-go@computer-go.org
> Subject: Re: [Computer-go] software like: http://ps.waltheri.net/
>
> On 5/5/2017 5:38 PM, David Fotland wrote:
> > Many Faces of Go Fuseki tutor can
Many Faces of Go Fuseki tutor can do this, but I'd have to help if you want to
start from an empty database. That's how I generate the tutor. You can add sgf
files to the existing tutor pretty easily.
David
> -Original Message-
> From: Computer-go