2013/2/5 Lars Buitinck <l.j.buiti...@uva.nl>:
> 2013/2/5 Olivier Grisel <olivier.gri...@ensta.org>:
>> Actually after reading @larsmans implementation I think we could
>> indeed start to investigate independently in pure python code and
>> later think whether it's worth to make it more interoperable with the
>> cython code of the MLP pull request.
>
> Yes, it's very simple to implement. The main thing that needs to be
> solved is the API. Actually, your remark about stacking something else
> than least-squares on top of a random hidden layer made me think:
> would it be a good idea to implement this as a transformer, say
> RandomSigmoid or RandomHiddenLayer?

A RandomSigmoidTransformer that implements only the hidden layer
projection / kernel expansion could be an interesting alternative to
Nystroem / AdditiveChi2 samplers.
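
Something like the following minimal sketch, where the class name
RandomHiddenLayer and its parameters are hypothetical rather than a
settled API:

import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.utils import check_random_state

class RandomHiddenLayer(BaseEstimator, TransformerMixin):
    """Nonlinear expansion through a randomly weighted tanh layer."""

    def __init__(self, n_components=100, random_state=None):
        self.n_components = n_components
        self.random_state = random_state

    def fit(self, X, y=None):
        rng = check_random_state(self.random_state)
        n_features = X.shape[1]
        # Input-to-hidden weights and biases are drawn at random
        # and never trained.
        self.weights_ = rng.uniform(-1, 1,
                                    size=(n_features, self.n_components))
        self.biases_ = rng.uniform(-1, 1, size=self.n_components)
        return self

    def transform(self, X):
        # Random nonlinear projection: tanh(X W + b)
        return np.tanh(np.dot(X, self.weights_) + self.biases_)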

But a classifier that stacks a simple least-squares fit on top of the
random hidden layer (named, for instance,
ExtremeLearningMachineClassifier) and implements the classifier API
(possibly with both class_weight and sample_weight support, so that it
can be used in boosting) would also be very user-friendly.
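
Roughly, such a classifier would only fit the hidden-to-output weights,
by least squares on the random expansion. A rough sketch, reusing the
hypothetical RandomHiddenLayer above (class_weight / sample_weight
support omitted for brevity):

import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.preprocessing import LabelBinarizer

class ExtremeLearningMachineClassifier(BaseEstimator, ClassifierMixin):
    def __init__(self, n_components=100, random_state=None):
        self.n_components = n_components
        self.random_state = random_state

    def fit(self, X, y):
        # Untrained random hidden layer (RandomHiddenLayer as above).
        self.hidden_ = RandomHiddenLayer(self.n_components,
                                         random_state=self.random_state)
        H = self.hidden_.fit_transform(X)
        self.binarizer_ = LabelBinarizer(neg_label=-1)
        Y = self.binarizer_.fit_transform(y)
        # Only the output weights are learned, by least squares.
        self.coef_, _, _, _ = np.linalg.lstsq(H, Y)
        return self

    def decision_function(self, X):
        return np.dot(self.hidden_.transform(X), self.coef_)

    def predict(self, X):
        return self.binarizer_.inverse_transform(self.decision_function(X))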

> (I'm not sure which module that would fit in; for the theoreticians
> among us, would such a transformer count as a GLM with link function
> tanh^{-1}?)

Argl. This does not feel like a linear model at all. I would rather
put the transformer in sklearn.kernel_approximation .
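
Usage would then mirror the existing samplers there, e.g. plugged into
a pipeline in front of a linear model (again assuming the hypothetical
RandomHiddenLayer sketched above):

from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

digits = load_digits()
model = Pipeline([
    ("hidden", RandomHiddenLayer(n_components=500, random_state=0)),
    ("logreg", LogisticRegression()),
])
model.fit(digits.data, digits.target)
print(model.score(digits.data, digits.target))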

--
Olivier
http://twitter.com/ogrisel - http://github.com/ogrisel
