Hey All,

I've been looking at adding Adagrad to SGD for a while now. Alex, Mathieu
and I were discussing the possibility of having an entirely separate class
for Adagrad. The benefit would be that the SGD implementation would not get
muddled up, and Adagrad could be implemented much more cleanly than it
could be inside the existing stochastic_gradient.py and sgd_fast.pyx files.
The downside would be significant code duplication: basically any bug in
SGD would most likely be a bug in AdaptiveSGD as well.
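To make the idea concrete, here's a minimal sketch of the per-coordinate Adagrad update a hypothetical AdaptiveSGD class would be built around (the function name and parameters are illustrative, not scikit-learn API):

```python
import numpy as np

def adagrad_update(w, grad, g_sum, lr=0.1, eps=1e-8):
    """One Adagrad step (illustrative helper, not scikit-learn API).

    Each coordinate gets its own effective learning rate, scaled down
    by the accumulated squared gradients for that coordinate.
    """
    g_sum += grad ** 2                       # accumulate squared gradients
    w -= lr * grad / (np.sqrt(g_sum) + eps)  # per-coordinate scaled step
    return w, g_sum

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2*w.
w = np.array([1.0, -2.0])
g_sum = np.zeros_like(w)
for _ in range(200):
    w, g_sum = adagrad_update(w, 2 * w, g_sum)
```

The point is that the accumulator `g_sum` is extra per-weight state that plain SGD doesn't carry, which is what makes folding it into the existing code messy.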

It's just an idea I was toying around with, but I decided it would be a
good idea to reach out to the community for some feedback. Let me know
what you think.

Danny
_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general