On 2/13/07, Jacques Basaldúa <[EMAIL PROTECTED]> wrote:
> And what distinguishes database look up from things like transposition
> table look up? Why wouldn't one use database look up during tree
> search?
> The interest is in rotation/mirror. In database search, what is good
> for a position is also good for the mirror/rotation. In tree search,
> rotation of the full board does not happen, and if it does (in a very
> simple position) it's pointless to detect, because it is not the same
> position.
Ok, I probably misunderstood. I thought you were referring to your
strengthened hash keys (attempting to show that this is something
'big' in contrast to the 'small' improvements one might get from
efficiently generating symmetrical hashes).
> I'm speculating here because I seem to be missing some context, but
> AFAICS almost any Go-specific application will have far more legal
> positions than hash keys, whatever the number of bits in the hash.
> Of course. That's because full global search is intractable.
No it's not. If you tried 5x5 Go with 32-bit hash keys you'd run into
exactly the same problem...
And my favorite:
> if the hash calculation is a major factor in the performance of
> your program, you should seriously consider adding some knowledge.
> We have seen this when talking about programming languages. Each time
> someone cares about details and builds programs that are optimized
> from the day of their birth, they are:
> a. Dirty and hard to maintain. (IMO only patched programs are
>    hard to maintain, and they are patched because important issues
>    were thought about too late.)
> b. Caring about "stupid" issues == ignoring Big Algorithmic
>    Improvements. (IMO if you don't really care about details you
>    don't really control the "big picture". And more importantly, you
>    only have to do it once, if you do it well. That will give you a
>    lot of time for thinking about algorithms.)
Don't misunderstand me, I *do* care about the details, why else would
I be in this discussion? I only found your statement "Using an
unnecessarily big hash key slows down the program." a bit out of
proportion in the context of strengthening hashkeys.
To make this point clearer, let's put it this way:
1) You are probably using something like 8-byte hashkeys.
2) You want to make them stronger/safer for a *specific* task, using
some clever, non-random hash key generation scheme, right?
3) The no-brainer solution guaranteed to provide extra safety is to
simply add bits.
4) Now, if your scheme for improving the strength of hash keys were
something 'big' I would expect it to save me at least something like a
byte (at an equivalent security level), but on most hardware you'd
probably want to save at least 32 bits (in order to get a noticeable
speed difference). However, I'd even be willing to grant you that, in
some exceptional cases, only a one-bit improvement might already be
worthwhile.
So please tell us, if the topic is really that 'big', how many bits
does your hash-strengthening procedure save, compared to the average
uniform random key generator with some basic checking?
computer-go mailing list