Hi everybody.
I was thinking about putting some work into a multi-layer perceptron (MLP)
implementation for sklearn. I think it would be a good addition to the other,
mostly linear, classifiers in sklearn. Together with the decision trees /
boosting that many people are working on at the moment, I think sklearn would
then cover most of the classifiers used today.

My question is: has anyone started on an MLP implementation yet? Or is there
any code lying around that people think is already pretty good?
I would try to keep it simple, with support for only one hidden layer, and do
a pure Python implementation to start with.

I'm also open to any suggestions.

My feature list would be:
- online, minibatch and batch learning
- vanilla gradient descent and rprop
- optional L2 weight decay
- tanh nonlinearities
- one class for regression and one for classification
- MSE and cross-entropy (classification only) loss functions

I think that would be a reasonable amount of features and should be pretty
easy to maintain. A rough sketch of what the core could look like is below.
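Just to make the proposal concrete, here is a minimal pure-NumPy sketch of the
regression case: one tanh hidden layer, MSE loss, vanilla batch gradient
descent, and optional L2 weight decay. The class name and the exact
constructor parameters are placeholders I made up for illustration, not a
proposed API.

import numpy as np


class SimpleMLPRegressor(object):
    """Toy one-hidden-layer MLP: tanh hidden units, linear output,
    MSE loss, vanilla batch gradient descent, optional L2 weight decay."""

    def __init__(self, n_hidden=10, learning_rate=0.01, alpha=0.0,
                 n_iter=1000, random_state=None):
        self.n_hidden = n_hidden
        self.learning_rate = learning_rate
        self.alpha = alpha          # strength of the L2 penalty on the weights
        self.n_iter = n_iter
        self.random_state = random_state

    def fit(self, X, y):
        rng = np.random.RandomState(self.random_state)
        n_samples, n_features = X.shape
        y = y.reshape(n_samples, -1)
        n_outputs = y.shape[1]

        # small random initial weights, zero biases
        self.W1_ = rng.uniform(-0.1, 0.1, (n_features, self.n_hidden))
        self.b1_ = np.zeros(self.n_hidden)
        self.W2_ = rng.uniform(-0.1, 0.1, (self.n_hidden, n_outputs))
        self.b2_ = np.zeros(n_outputs)

        for _ in range(self.n_iter):
            # forward pass
            hidden = np.tanh(np.dot(X, self.W1_) + self.b1_)
            output = np.dot(hidden, self.W2_) + self.b2_

            # backward pass for the MSE loss (plus L2 term on the weights)
            delta_out = (output - y) / n_samples
            grad_W2 = np.dot(hidden.T, delta_out) + self.alpha * self.W2_
            grad_b2 = delta_out.sum(axis=0)

            delta_hidden = np.dot(delta_out, self.W2_.T) * (1.0 - hidden ** 2)
            grad_W1 = np.dot(X.T, delta_hidden) + self.alpha * self.W1_
            grad_b1 = delta_hidden.sum(axis=0)

            # vanilla batch gradient descent update
            self.W2_ -= self.learning_rate * grad_W2
            self.b2_ -= self.learning_rate * grad_b2
            self.W1_ -= self.learning_rate * grad_W1
            self.b1_ -= self.learning_rate * grad_b1
        return self

    def predict(self, X):
        hidden = np.tanh(np.dot(X, self.W1_) + self.b1_)
        return np.dot(hidden, self.W2_) + self.b2_

The real thing would add minibatch/online updates, rprop, the cross-entropy
loss for the classification variant, and the usual input validation on top of
this skeleton, but the core fit/predict logic wouldn't get much bigger.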

Cheers,
Andy
