I'd love to see an MLP in the scikit!

best,
Peter
2011/11/4 Andreas Müller <[email protected]>:
> On 11/04/2011 02:59 PM, Lars Buitinck wrote:
>> 2011/11/4 Andreas Müller <[email protected]>:
>>> My question is: has anyone started with an MLP implementation yet?
>>
>> I was just working on one :)
>> I have the predict function for an arbitrary number of hidden layers
>> (classifier case) and some snippets of the RPROP algorithm. I've been
>> using weight vectors that come out of a Matlab implementation for now.
>>
>> There used to be an MLP implementation in older versions (around 0.2,
>> I believe) but it was abandoned.
>>
> Are you using pure Python at the moment?
> Where can I find your code? And is the goal of your code to
> be included in the scikits?
>
>>> My feature list would be:
>>> - online, minibatch and batch learning
>>
>> I only need batch learning and classification for now... shall we
>> keep it simple?
>>
> I think it is necessary to have minibatch learning, so building that
> into the code from the beginning seems like a good idea.
>
>>> - vanilla gradient descent and RPROP
>>> - optional L2 weight decay
>>> - tanh nonlinearities
>>
>> Logistic activation functions seem fashionable; that's what Bishop and
>> other textbooks use. I'm not sure if there's a big difference, but it
>> seems to me that gradient computations might be slightly more
>> efficient (guesswork, I admit). We can always add a steepness
>> parameter later.
>>
> In my personal experience, tanh works better. LeCun uses tanh ;)
>
>> I've been reading the RPROP papers and it looks like IRPROP- is the
>> algorithm to go for; it's simple and not significantly worse than
>> RPROP+. We could look at the RPROP implementation in Wapiti (and maybe
>> even copy bits of it; it's MIT-licensed).
>>
> RPROP is very easy to implement; I use it in my lab all the time.
> I have no personal experience with IRPROP-. How is it different
> from IRPROP? What is RPROP+? Can you give me references?
>
> Cheers,
> Andy
>

--
Peter Prettenhofer
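
For reference, here is a minimal NumPy sketch of the two pieces discussed
above: a full-batch forward/backward pass for a one-hidden-layer MLP
classifier with tanh units, and the IRPROP- update (essentially RPROP
without weight backtracking: when a partial derivative changes sign, the
per-weight step size is shrunk and the stored derivative is zeroed, so that
weight is left untouched for that iteration, following Igel & Hüsken). This
is not the code Lars or Andreas refer to; the function names, default step
sizes, and toy data are made up purely for illustration.

# Illustrative sketch only, not the implementation discussed in the thread.
import numpy as np


def forward(X, W1, b1, W2, b2):
    """Forward pass: tanh hidden layer, softmax output."""
    H = np.tanh(X @ W1 + b1)                       # hidden activations
    Z = H @ W2 + b2                                # output pre-activations
    Z -= Z.max(axis=1, keepdims=True)              # numerical stability
    P = np.exp(Z)
    P /= P.sum(axis=1, keepdims=True)              # class probabilities
    return H, P


def gradients(X, Y, W1, b1, W2, b2):
    """Batch gradients of the cross-entropy loss (Y is one-hot)."""
    H, P = forward(X, W1, b1, W2, b2)
    n = X.shape[0]
    dZ = (P - Y) / n                               # softmax/cross-entropy delta
    dW2 = H.T @ dZ
    db2 = dZ.sum(axis=0)
    dH = dZ @ W2.T * (1.0 - H ** 2)                # tanh'(a) = 1 - tanh(a)^2
    dW1 = X.T @ dH
    db1 = dH.sum(axis=0)
    return [dW1, db1, dW2, db2]


def irprop_minus(params, grads, prev_grads, steps,
                 eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=50.0):
    """One IRPROP- step: adapt per-weight step sizes from gradient signs."""
    for p, g, g_prev, d in zip(params, grads, prev_grads, steps):
        sign_change = g * g_prev
        d[sign_change > 0] = np.minimum(d[sign_change > 0] * eta_plus, step_max)
        d[sign_change < 0] = np.maximum(d[sign_change < 0] * eta_minus, step_min)
        g[sign_change < 0] = 0.0                   # IRPROP-: forget that gradient
        p -= np.sign(g) * d                        # update uses only the sign
        g_prev[...] = g                            # remember for the next step


if __name__ == "__main__":
    rng = np.random.RandomState(0)
    X = rng.randn(200, 10)
    y = (X[:, 0] + X[:, 1] > 0).astype(int)        # toy, linearly separable
    Y = np.eye(2)[y]                               # one-hot targets

    n_hidden = 20
    W1 = rng.randn(10, n_hidden) * 0.1
    b1 = np.zeros(n_hidden)
    W2 = rng.randn(n_hidden, 2) * 0.1
    b2 = np.zeros(2)

    params = [W1, b1, W2, b2]
    prev_grads = [np.zeros_like(p) for p in params]
    steps = [np.full_like(p, 0.1) for p in params]

    for epoch in range(100):                       # full-batch training
        grads = gradients(X, Y, W1, b1, W2, b2)
        irprop_minus(params, grads, prev_grads, steps)

    _, P = forward(X, W1, b1, W2, b2)
    print("training accuracy:", (P.argmax(axis=1) == y).mean())

Swapping irprop_minus for a plain "p -= learning_rate * g" loop gives the
vanilla batch gradient descent from the feature list, and adding a
"lam * W" term to the weight gradients gives optional L2 weight decay.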
