>> Of course there are many other possibilities like pretraining,
>> deeper networks, different learning rate schedules, etc.
>> You are right, this is somewhat of an active research field
>> Though I have not seen conclusive evidence that any
>> of these methods are consistently better than a vanilla mlp.
> http://www.dumitru.ca/files/publications/icml_07.pdf the table on page 7
> makes a pretty compelling case, I'd say.
>
These numbers are weird.
A basic grid search with an RBF SVM gives 1.4% error on MNIST.
Using a vanilla MLP with 500 hidden units and RPROP (no momentum or
weight decay) and early stopping, or cross-validating a constant
learning rate in the same setup, gives about 2%, I think.
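For what it's worth, the "basic grid search with an RBF SVM" above can be sketched in a few lines of scikit-learn. This is just an illustration, not the exact setup behind the 1.4% figure: it uses the small built-in 8x8 digits dataset as a stand-in for full MNIST (the 1.4% number refers to MNIST proper, so the error here will differ), and the parameter grid is an assumed, minimal one.

```python
# Hedged sketch: grid-searching the two RBF-SVM hyperparameters (C, gamma)
# on scikit-learn's small digits dataset, standing in for full MNIST.
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel values to [0, 1]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# A basic grid over C and gamma; values are illustrative only.
param_grid = {"C": [1, 10, 100], "gamma": [0.001, 0.01, 0.1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=3)
search.fit(X_train, y_train)

error = 1.0 - search.score(X_test, y_test)
print(f"best params: {search.best_params_}, test error: {error:.3f}")
```

On the 8x8 digits task this kind of search typically lands in the low single-digit error range; the point is only how little tuning an RBF SVM needs to be competitive.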

> Now, there's also the results out of Juergen Schmidhuber's lab that show that
> if you train for months on a GPU, add all kinds of prior knowledge into the
> preprocessing pipeline, make careful choices about the learning rate
> schedule, initialization, and activation function (some of this is pretty
> easy and well-documented in that paper by Yann LeCun that Olivier sent around
> earlier in the thread, other parts will take a lot of fiddling), then you
> *can* make vanilla MLPs perform really well on MNIST, but this says more
> about the devotion of the practitioners to this (rather artificial) task, and
> the sorts of built-in prior knowledge they used, than it does about the
> strength of the learning algorithm.
>
Don't get me wrong. I'm not a fan of the MNIST focused research.
One of the reasons I want an MLP in sklearn is to make it easier
to compare with other learning algorithms on a wide range of
tasks.
I am pretty sceptical about neural networks myself, but as
they scale very well, they definitely seem like an alternative
to linear classification.

Cheers,
Andy

ps: I would never have imagined that at some point in my life
I'd argue _for_ MLPs... I think my advisor got to me.

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general