On Sat, May 26, 2012 at 11:37 PM, Ariel Rokem <[email protected]> wrote:
>
> Yes - that seems to work fine for this problem. However, I would like
> to also apply this to larger problems and I am worried that ridge
> regression will eventually become very slow. That's why I was trying
> out SGD.
>
>
In Ridge, you can increase the tolerance parameter (tol) to make the
algorithm terminate earlier (this is only effective with the sparse
conjugate gradient solver, which is the default for sparse data).
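For example, a minimal sketch (assuming Ridge's tol parameter; the
value 1e-3 is illustrative, and the speed/accuracy trade-off will
depend on your problem):

import numpy as np
from scipy import sparse
from sklearn.linear_model import Ridge

# Toy sparse problem standing in for the real data.
X = sparse.rand(1000, 500, density=0.01).tocsr()
y = np.random.rand(1000)

# A looser tolerance lets the conjugate gradient solver stop earlier,
# trading some accuracy for speed.
clf = Ridge(alpha=1.0, tol=1e-3)
clf.fit(X, y)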
> BTW - is there any way to track the progress of these algorithms, as
> they are running? That is, print out something every n iterations
> reporting on the convergence of the error?
>
Not yet. This is something we should consider adding, in my opinion.
Here's an idea for a possible API:
def callback(estimator, event, **kw):
    if event == "new-iteration":
        print "Score at iteration %d: %f" % (kw["iteration"],
                                             estimator.score(X, y))
    elif event == "converged":
        print "Algorithm converged at iteration %d" % kw["iteration"]

clf = Ridge(callback=callback)
clf.fit(X, y)
This way, people could monitor the events they are interested in.
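For what it's worth, the estimator side would be easy to wire up. Here
is a rough sketch of what the fit loop could look like (everything
below is hypothetical, nothing like this exists in scikit-learn today):

class CallbackRidge(object):
    # Hypothetical sketch, not existing scikit-learn code.

    def __init__(self, alpha=1.0, tol=1e-3, max_iter=100, callback=None):
        self.alpha = alpha
        self.tol = tol
        self.max_iter = max_iter
        self.callback = callback

    def _notify(self, event, **kw):
        # Fire the user callback, if any, for the given event.
        if self.callback is not None:
            self.callback(self, event, **kw)

    def fit(self, X, y):
        residual = 1.0  # stand-in for the solver's real residual
        for it in range(self.max_iter):
            # ... one real optimization step would go here ...
            residual *= 0.5  # fake progress so the sketch runs
            self._notify("new-iteration", iteration=it)
            if residual < self.tol:
                self._notify("converged", iteration=it)
                break
        return self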
Mathieu