The problem with callbacks is that for callbacks on each iteration to be
feasible, they need to be Cython functions. Otherwise they will be too slow.
You could do Python callbacks, but they could not be called at every
iteration, and so they wouldn't be suitable to implement something like
adagrad or adam.

We could implement some plug-and-play callbacks in Cython and pass a list of
strings to the constructor of a linear model, deciding which callbacks to
execute.
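To make the idea concrete, here is a rough pure-Python sketch of what the
string-based selection could look like. Everything here is hypothetical
(the registry, the callback names, the `callbacks` parameter, and the toy
model are all made up for illustration); in the real proposal the callbacks
themselves would be Cython functions living next to the SGD loop.

```python
# Hypothetical sketch of the proposed API. CALLBACK_REGISTRY, the callback
# names, and ToyLinearModel are illustrative only, not scikit-learn code.

def anneal_learning_rate(state):
    # Halve the learning rate every 10 epochs.
    if state["epoch"] > 0 and state["epoch"] % 10 == 0:
        state["lr"] *= 0.5

def verbose_output(state):
    # Print progress after each epoch.
    print("epoch %d, lr=%.4f" % (state["epoch"], state["lr"]))

# Plug-and-play registry: the constructor receives strings and looks the
# callbacks up here, so users never pass raw Python function objects.
CALLBACK_REGISTRY = {
    "anneal_lr": anneal_learning_rate,
    "verbose": verbose_output,
}

class ToyLinearModel:
    def __init__(self, callbacks=()):
        self._callbacks = [CALLBACK_REGISTRY[name] for name in callbacks]
        self.lr = 0.1

    def fit(self, n_epochs=25):
        state = {"epoch": 0, "lr": self.lr}
        for epoch in range(n_epochs):
            state["epoch"] = epoch
            # ... one epoch of SGD updates would go here ...
            for cb in self._callbacks:  # epoch-level hook
                cb(state)
        self.lr = state["lr"]
        return self
```

So a user would just write `ToyLinearModel(callbacks=["anneal_lr"])` and get
annealing without touching any Cython.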
How does that sound, Andreas? I'd be glad to share more thoughts on the idea.
Regards,
Karan.




On Wed, Mar 15, 2017 8:18 PM, Andreas Mueller [email protected]  wrote:






On 03/15/2017 04:48 AM, Karan Desai wrote:

4. About a tool to anneal learning rate: I suggest a new approach to look at
this - as a callback. I searched through the documentation and I could not
find this way of handling tidbits during training of models. We should be
able to provide a callback to the constructor of a linear model which can do
any dedicated job after every epoch, be it learning rate annealing, saving a
model checkpoint, getting custom verbose output, or something as creative as
uploading data to a server for real-time plots on a website.

There has been some effort on doing adagrad, but it was ultimately
discontinued, I think. There was quite a bit of complexity to handle.

The problem with callbacks is that for callbacks on each iteration to be
feasible, they need to be Cython functions. Otherwise they will be too slow.
You could do Python callbacks, but they could not be called at every
iteration, and so they wouldn't be suitable to implement something like
adagrad or adam.
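To illustrate the point about per-iteration state: here is a minimal adagrad
sketch in plain Python (names and data are made up for illustration). The
squared-gradient accumulator has to be updated on every single sample, inside
the hot inner loop that scikit-learn implements in Cython, so an epoch-level
Python callback fires far too rarely to maintain it.

```python
# Illustrative only: a toy adagrad loop on a squared-loss linear model,
# showing that the accumulator g2 needs a hook on EVERY iteration.
import numpy as np

def sgd_adagrad(X, y, lr=0.5, eps=1e-8, n_epochs=5):
    w = np.zeros(X.shape[1])
    g2 = np.zeros_like(w)           # per-iteration state
    for _ in range(n_epochs):
        for xi, yi in zip(X, y):    # inner loop: Cython territory
            grad = (xi @ w - yi) * xi             # squared-loss gradient
            g2 += grad ** 2                       # must run every iteration
            w -= lr * grad / (np.sqrt(g2) + eps)  # adagrad update
        # An epoch-level Python callback could only fire here, which is
        # too coarse to maintain g2 correctly.
    return w
```

A Python callback invoked at this inner-loop frequency would also pay the
interpreter's per-call overhead on every sample, which is the slowness
concern above.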




Best,

Andy

_______________________________________________

scikit-learn mailing list

[email protected]

https://mail.python.org/mailman/listinfo/scikit-learn
