On Tue, Mar 06, 2012 at 10:22:08PM +0100, Andreas wrote:
> > I just tried my simple implementation, relying on scipy's BFGS, and it
> > took approx. 1s to train on an artificial dataset with (n_samples=10000,
> > n_features=20, n_classes=10), 15s on (n_samples=10000, n_features=100,
> > n_classes=10). So I think, it can be ok for medium scale.
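(For readers following along: Andreas's message doesn't include his code, but a "simple implementation relying on scipy's BFGS" for multinomial logistic regression presumably looks something like the sketch below — the L2-penalized softmax loss plus gradient handed to `scipy.optimize.minimize`. The function and variable names here are illustrative, not his actual code.)

```python
import numpy as np
from scipy.optimize import minimize

def softmax_loss_grad(w, X, Y, alpha):
    """L2-penalized multinomial log-loss and its gradient.

    w : flattened (n_classes, n_features) weight matrix
    Y : one-hot encoded labels, shape (n_samples, n_classes)
    """
    n_samples, n_features = X.shape
    n_classes = Y.shape[1]
    W = w.reshape(n_classes, n_features)
    scores = X @ W.T
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    exp_s = np.exp(scores)
    probs = exp_s / exp_s.sum(axis=1, keepdims=True)
    loss = -np.sum(Y * np.log(probs + 1e-12)) / n_samples
    loss += 0.5 * alpha * np.sum(W ** 2)          # L2 penalty
    grad = (probs - Y).T @ X / n_samples + alpha * W
    return loss, grad.ravel()

# small synthetic problem in the spirit of the benchmark above
rng = np.random.RandomState(0)
n_samples, n_features, n_classes = 1000, 20, 10
X = rng.randn(n_samples, n_features)
y = rng.randint(0, n_classes, size=n_samples)
Y = np.eye(n_classes)[y]

w0 = np.zeros(n_classes * n_features)
res = minimize(softmax_loss_grad, w0, args=(X, Y, 1e-3),
               method="BFGS", jac=True, options={"maxiter": 100})
```

Starting from zero weights, the loss begins at log(n_classes) ≈ 2.30 and BFGS drives it down from there; the fitted weights are in `res.x`.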

> Alex, Gael, what do you think about that?

Don't know. Seems a bit slow to me, but that's a situation in which I
badly want speed. What's the penalty (type and amount)? How does it
compare to liblinear on a two-class problem?
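(A liblinear baseline of the kind Gael asks for can be timed directly through scikit-learn's liblinear-backed `LogisticRegression`; the dataset parameters below just mirror the first benchmark Andreas quoted, two-class instead of ten.)

```python
from time import time

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression  # liblinear-backed

# two-class problem with the same shape as the benchmark above
X, y = make_classification(n_samples=10000, n_features=20,
                           n_informative=10, random_state=0)

clf = LogisticRegression(C=1.0, solver="liblinear")
t0 = time()
clf.fit(X, y)
print("liblinear fit: %.3fs, train accuracy: %.3f"
      % (time() - t0, clf.score(X, y)))
```

On this problem size liblinear typically fits in well under a second, which gives a concrete number to hold the BFGS timings against.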

> Having some base implementation that can be improved with a better
> optimizer later seems like a reasonable starting point.

Well, before committing to anything, I would first want to know that it
is the right algorithmic strategy, otherwise I am afraid that we will
have our design constrained.

Gael

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
