By the way, does anybody know of any nifty tools or heuristics for efficient probabilistic multi-parameter optimization? In other words, like multi-dimensional optimization, except instead of your function returning a deterministic value, it returns the result of a Bernoulli trial, and the heuristic uses those trial results to converge as rapidly as possible to parameter values that roughly maximize the success probability.
I recommend evolutionary algorithms because they are robust to noise and don't require fitting a linear or quadratic model of the function being optimized. I would go as simple as a (1+1)-ES (a glorified name for a simple hill climber that probes randomly for its next step: mutate the current point, keep the mutant only if it scores better). I would also use restarts: run it until no further improvement is apparent, then restart from a fresh random point a few times (5-10) and take the overall best found. Since each evaluation is a noisy Bernoulli trial, average several trials per candidate to estimate the success probability before comparing. You'd be surprised how far you can get with this method!
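To make that concrete, here is a minimal sketch in Python of a (1+1)-ES with restarts on a Bernoulli-trial objective. The step size (sigma), the number of trials averaged per evaluation, and the stall-based stopping rule are all assumptions for illustration, not tuned values:

```python
import random


def one_plus_one_es(bernoulli_trial, x0, sigma=0.3, trials_per_eval=50,
                    max_stall=200, rng=None):
    """(1+1)-ES hill climber for a noisy Bernoulli objective.

    bernoulli_trial(x) returns True/False; the success probability is
    estimated by averaging trials_per_eval trials per candidate
    (an illustrative choice: more trials mean less noise but more cost).
    Stops after max_stall consecutive non-improving mutations.
    """
    rng = rng or random.Random()

    def estimate(x):
        return sum(bernoulli_trial(x) for _ in range(trials_per_eval)) / trials_per_eval

    x, fx = list(x0), estimate(x0)
    stall = 0
    while stall < max_stall:
        # Mutate every parameter with Gaussian noise and keep strict improvements.
        cand = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = estimate(cand)
        if fc > fx:
            x, fx, stall = cand, fc, 0
        else:
            stall += 1
    return x, fx


def restarted_es(bernoulli_trial, dim, restarts=5, rng=None):
    """Run the (1+1)-ES from several random starts; return the best result."""
    rng = rng or random.Random()
    best = None
    for _ in range(restarts):
        x0 = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
        result = one_plus_one_es(bernoulli_trial, x0, rng=rng)
        if best is None or result[1] > best[1]:
            best = result
    return best
```

The strict "keep only if better" comparison plus averaged trials is the simplest way to cope with the noise; a fancier version would re-evaluate the incumbent occasionally to avoid locking onto a lucky estimate.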
Adrian
_______________________________________________
computer-go mailing list
[email protected]
http://www.computer-go.org/mailman/listinfo/computer-go/
