So please tell us, if the topic is really that 'big', how many bits
does your hash-strengthening procedure save, compared to the average
uniform random key generator with some basic checking?
1. As I said, I compute 64-bit hashes for the whole board, but the
lowest 32 bits alone are safe for detecting a repetition within the
last 7 positions, and I believe (again, I may be wrong) that the only
possible superko at that depth is triple ko. Triple ko is also the only
superko unanimously banned under any ruleset. So I only have to compare
one 32-bit key to detect triple ko, or, by comparing all the safe ones,
this covers ko, double ko and triple ko. I know the saving is not very
big, because it only applies after a one-stone capture, but it's free
and permanent.
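For concreteness, here is a minimal sketch of that scheme as I understand it: a 64-bit Zobrist table (filled with splitmix64 here, purely as a stand-in RNG), an XOR update per stone, and a cheap comparison on the low 32 bits for the recent-repetition check. All names and sizes are illustrative, not taken from any actual engine.

```c
#include <stdint.h>

#define BOARD_CELLS 361   /* 19x19, illustrative */
#define COLORS 2

/* One 64-bit key per (cell, color); the real table would be a
 * carefully selected set of constants, as discussed in point 3. */
static uint64_t zobrist[BOARD_CELLS][COLORS];

/* Stand-in key generator (splitmix64), only to make this runnable. */
static uint64_t splitmix64(uint64_t *state) {
    uint64_t z = (*state += 0x9E3779B97F4A7C15ULL);
    z = (z ^ (z >> 30)) * 0xBF58476D1CE4E5B9ULL;
    z = (z ^ (z >> 27)) * 0x94D049BB133111EBULL;
    return z ^ (z >> 31);
}

static void init_zobrist(uint64_t seed) {
    for (int c = 0; c < BOARD_CELLS; c++)
        for (int k = 0; k < COLORS; k++)
            zobrist[c][k] = splitmix64(&seed);
}

/* XOR a stone in or out of the running 64-bit hash. */
static uint64_t toggle(uint64_t hash, int cell, int color) {
    return hash ^ zobrist[cell][color];
}

/* Cheap repetition test on the low 32 bits only, for positions
 * within the last few moves where 32 bits are claimed safe. */
static int low32_equal(uint64_t a, uint64_t b) {
    return (uint32_t)a == (uint32_t)b;
}
```

The point of the sketch is that the fast check is one 32-bit compare; the full 64-bit key is still there whenever the short window assumption does not hold.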
2. More importantly: in a local search (I have not finished that; I
would like to have something similar to Thomas Wolf's GoTools, but
linked to a "high level search" in a particular way) I have a faster
32-bit search (only the asm kernel) which uses 32-bit keys (incomplete
64-bit keys). Of course, if the 32-bit search is not safe, the
"optimization" is not used and the full key must be used. If the search
is aborted by timeout, unlike in UCT, the result is useless, so (I
presume) only a few "slow" searches will be completed. I can't tell how
big this saving is, but it has potential. As you say, searches with
more than 16 empty cells are feasible, but not really fast.
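The fall-back idea can be sketched as a transposition-table probe that compares the cheap 32-bit tag first and consults the full 64-bit key only when the 32-bit path is not known to be safe. The table layout and the `safe32` flag are my own illustration, not a description of the actual asm kernel.

```c
#include <stdint.h>

#define TT_SIZE 4096  /* toy table; a real one would be far larger */

typedef struct {
    uint32_t tag32;   /* low 32 bits of the Zobrist key */
    uint64_t key64;   /* full key, consulted only when needed */
    int      value;   /* stored search result */
    int      used;
} TTEntry;

static TTEntry tt[TT_SIZE];

static void tt_store(uint64_t key, int value) {
    TTEntry *e = &tt[key % TT_SIZE];
    e->tag32 = (uint32_t)key;
    e->key64 = key;
    e->value = value;
    e->used  = 1;
}

/* safe32 != 0 means the 32-bit comparison is known collision-free
 * for this (small) local search; otherwise verify the full key. */
static int tt_probe(uint64_t key, int safe32, int *value_out) {
    TTEntry *e = &tt[key % TT_SIZE];
    if (!e->used || e->tag32 != (uint32_t)key)
        return 0;                 /* definite miss */
    if (!safe32 && e->key64 != key)
        return 0;                 /* 32-bit collision caught */
    *value_out = e->value;
    return 1;
}
```

When `safe32` holds, the probe never touches the 64-bit word, which is where the speed of the 32-bit kernel would come from.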
3. "..uniform random key generator with some basic checking?" It is a
set created by a RNG but the keys have been selected (about 3 of 5 do
not pass all the tests). It is more than just "basic checking" that's
all. After all its just a table of constants. I get the best set I can
and forget about it. I wanted to know what others have done in this area.
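The post does not say which tests the keys must pass, only that roughly 3 of 5 candidates fail. As a hedged illustration of the kind of filtering that could reject that many, here is a candidate-acceptance function using two hypothetical criteria: roughly balanced popcount, and a minimum pairwise Hamming distance to the keys already accepted. Both thresholds are made up for the example.

```c
#include <stdint.h>

static int popcount64(uint64_t x) {
    int n = 0;
    while (x) { x &= x - 1; n++; }  /* clear lowest set bit */
    return n;
}

/* Hypothetical acceptance tests -- NOT the author's actual tests. */
static int key_ok(uint64_t cand, const uint64_t *accepted, int n) {
    int pc = popcount64(cand);
    if (pc < 24 || pc > 40)              /* roughly balanced bits */
        return 0;
    for (int i = 0; i < n; i++)          /* min pairwise Hamming distance */
        if (popcount64(cand ^ accepted[i]) < 16)
            return 0;
    return 1;
}
```

Generating candidates from an RNG and keeping only those that pass `key_ok` gives exactly the "table of constants you build once and forget about" described above.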
computer-go mailing list