On Dec 21, 2007 10:03 AM, Álvaro Begué <[EMAIL PROTECTED]> wrote:
> I am sure MM is a perfectly good algorithm for this purpose, but it has
> the serious down side that I don't understand it. :) I do understand the
> general idea behind it and how it works in some simple cases, but I don't
> know enough to adapt it to my particular needs.
Based on my analysis, the assumptions in MM are not always that accurate. That is partly what led me to derive my own method; of course, I simply used what I knew.

> > Nevertheless, Newton's method can be applied to each parameter one by
> > one. If I understand correctly, this is what Jason proposed. One step of
> > Newton's method on just one parameter has a computational cost very
> > similar to MM, and is much more efficient (it should converge in just
> > one or two iterations).
>
> Hmmm... That sounds similar to just considering the diagonal part of the
> Hessian matrix, or something of that sort. If the features are sort of
> orthogonal in some sense (uncorrelated?), this should be a good
> approximation, but I don't know how well this would apply to our
> particular regression problems.

I very much doubt that the parameters are uncorrelated.

> Remi, do you dump the training data to a simple text file and then feed
> that to a program that does the regression, or does the program compute
> it on the fly from the database of positions?

I'm not Remi, but I can at least say that I plan to dump my data to a text file. I see a few advantages in that: it removes logic from my engine that is almost orthogonal to its true purpose, and it lets me easily share my tool with others (I plan to release it under GPLv3 when done). I expect the iterative passes over the very large input set to be the limiting factor for the tool.
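For what it's worth, the one-parameter-at-a-time Newton step with a diagonal Hessian can be sketched in a few lines. This is only an illustration, using plain logistic regression as a stand-in for the actual pattern-strength model; the function names and the toy data are mine, not anything from Remi's or Jason's code.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def coordinate_newton(xs, ys, n_features, sweeps=20):
    """Fit logistic-regression weights by updating one parameter at a
    time, each update being a Newton step that uses only the matching
    diagonal entry of the Hessian (the approximation discussed above)."""
    w = [0.0] * n_features
    for _ in range(sweeps):
        for j in range(n_features):
            grad = 0.0  # dL/dw_j of the negative log-likelihood
            hess = 0.0  # d2L/dw_j^2, i.e. the diagonal Hessian entry
            for x, y in zip(xs, ys):
                p = sigmoid(sum(wk * xk for wk, xk in zip(w, x)))
                grad += (p - y) * x[j]
                hess += p * (1.0 - p) * x[j] * x[j]
            if hess > 1e-12:         # guard against a flat direction
                w[j] -= grad / hess  # one-parameter Newton step
    return w
```

If the features really were uncorrelated, each coordinate step would be close to a full Newton step; with correlated features (the case I expect in practice) you still converge, just with more sweeps over the parameters.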
_______________________________________________
computer-go mailing list
[email protected]
http://www.computer-go.org/mailman/listinfo/computer-go/
