> In my opinion, Adagrad is still on the lower side in terms of number of
> citations (currently 107 according to Google) for inclusion into scikit-learn.
> So unless there is strong evidence that it outperforms other solvers (e.g., in
> computer vision or NLP papers that use Adagrad), I'm personally -1 for its
> inclusion into scikit-learn.

Agreed. Also, there is currently a lot of progress on improving
stochastic gradient based solvers, in particular at the latest NIPS and
ICML. Ideally, I think it is best to wait a bit for the dust to settle
and then implement whatever comes out as the best option (my current
favorite is SAG, but let's wait and see).

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general