Hello,
I was wondering why there isn't a classic neural network implementation in
scikit-learn (a multilayer perceptron). This could be done at varying levels of
complexity: it could be hardcoded to a single hidden layer, with options to
specify the type of its units (sigmoid, tanh, rectified linear, etc.), the
learning rate, and values for weight decay and momentum.
It could also be made to accept multiple hidden layers, with the ability to
specify the number of neurons and their type for each one.
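To make the idea concrete, here is a rough sketch of what the simple version
might look like, using only NumPy. The class name, parameter names, and API
are purely illustrative (not an existing scikit-learn class): one hidden
layer, a selectable activation, and plain gradient descent with momentum and
L2 weight decay.

```python
import numpy as np

# (activation, derivative-in-terms-of-the-activation) pairs
ACTIVATIONS = {
    "sigmoid": (lambda z: 1.0 / (1.0 + np.exp(-z)), lambda a: a * (1.0 - a)),
    "tanh": (np.tanh, lambda a: 1.0 - a ** 2),
    "relu": (lambda z: np.maximum(0.0, z), lambda a: (a > 0).astype(float)),
}

class SimpleMLP:
    """Hypothetical one-hidden-layer MLP for binary classification."""

    def __init__(self, n_hidden=8, activation="tanh", learning_rate=0.2,
                 weight_decay=1e-4, momentum=0.9, n_iter=1000, random_state=0):
        self.n_hidden = n_hidden
        self.activation = activation
        self.learning_rate = learning_rate
        self.weight_decay = weight_decay
        self.momentum = momentum
        self.n_iter = n_iter
        self.random_state = random_state

    def fit(self, X, y):
        rng = np.random.RandomState(self.random_state)
        act, act_grad = ACTIVATIONS[self.activation]
        n, d = X.shape
        y = y.reshape(-1, 1).astype(float)
        # small random initialisation
        self.W1 = rng.randn(d, self.n_hidden) * 0.5
        self.b1 = np.zeros(self.n_hidden)
        self.W2 = rng.randn(self.n_hidden, 1) * 0.5
        self.b2 = np.zeros(1)
        # momentum buffers, one per parameter
        vW1 = np.zeros_like(self.W1); vb1 = np.zeros_like(self.b1)
        vW2 = np.zeros_like(self.W2); vb2 = np.zeros_like(self.b2)
        for _ in range(self.n_iter):
            # forward pass: hidden activation, then sigmoid output
            H = act(X @ self.W1 + self.b1)
            out = 1.0 / (1.0 + np.exp(-(H @ self.W2 + self.b2)))
            # backward pass for cross-entropy loss with sigmoid output,
            # plus the L2 (weight decay) term on the weights
            d_out = (out - y) / n
            gW2 = H.T @ d_out + self.weight_decay * self.W2
            gb2 = d_out.sum(axis=0)
            d_hidden = (d_out @ self.W2.T) * act_grad(H)
            gW1 = X.T @ d_hidden + self.weight_decay * self.W1
            gb1 = d_hidden.sum(axis=0)
            # classical momentum update
            vW1 = self.momentum * vW1 - self.learning_rate * gW1
            vb1 = self.momentum * vb1 - self.learning_rate * gb1
            vW2 = self.momentum * vW2 - self.learning_rate * gW2
            vb2 = self.momentum * vb2 - self.learning_rate * gb2
            self.W1 += vW1; self.b1 += vb1
            self.W2 += vW2; self.b2 += vb2
        return self

    def predict(self, X):
        act, _ = ACTIVATIONS[self.activation]
        H = act(X @ self.W1 + self.b1)
        out = 1.0 / (1.0 + np.exp(-(H @ self.W2 + self.b2)))
        return (out.ravel() >= 0.5).astype(int)
```

The multiple-hidden-layer variant would just replace the single (W1, b1)
pair with a list of layer parameters, one activation per layer.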
Has this been considered before, with no one having gotten around to it?
Would it be of interest to you?
There are of course more sophisticated methods that would be nice to have as
well. I'm asking only about the basic type because that is what I would
currently be willing to help with, but it would be great if more were under
consideration.
_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general