Re: [Scikit-learn-general] Adding Sparse Autoencoder to Scikit

2013-06-26 Thread Vlad Niculae
Why would autoencoders be naturally batch? I think historically one of their early uses was for online PCA, but I may be wrong.

Vlad

On Wed, Jun 26, 2013 at 11:51 PM, Issam wrote:
> Hi @Olivier, you are absolutely right, scipy.optimize.fmin_l_bfgs_b
> would not be suitable for MLP because some practitioners would want
> on-line updating (partial_fit()) rather than batch. […]

Re: [Scikit-learn-general] Adding Sparse Autoencoder to Scikit

2013-06-26 Thread Issam
Hi @Olivier, you are absolutely right: scipy.optimize.fmin_l_bfgs_b would not be suitable for MLP, because some practitioners would want on-line updating (partial_fit()) rather than batch training. However, what's your opinion on using fmin_l_bfgs_b in naturally BATCH processing algorithms like […]
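To make the distinction concrete, here is a minimal NumPy-only sketch of the partial_fit() pattern under discussion (a hypothetical toy class, not scikit-learn's actual estimator): the model keeps its weights between calls, so it can be updated one mini-batch at a time instead of requiring the whole dataset at once.

```python
import numpy as np

class TinySGDRegressor:
    """Hypothetical illustration of the partial_fit() pattern:
    state persists between calls, so training can proceed
    mini-batch by mini-batch (on-line) rather than in one batch."""

    def __init__(self, n_features, lr=0.1):
        self.w = np.zeros(n_features)
        self.lr = lr

    def partial_fit(self, X, y):
        # one gradient step on this mini-batch's squared-error loss
        grad = X.T.dot(X.dot(self.w) - y) / len(y)
        self.w -= self.lr * grad
        return self

    def predict(self, X):
        return X.dot(self.w)

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
true_w = np.array([2.0, -1.0])
y = X.dot(true_w)

model = TinySGDRegressor(n_features=2)
for epoch in range(30):
    for i in range(0, 200, 20):      # feed the data in mini-batches
        model.partial_fit(X[i:i + 20], y[i:i + 20])
```

A batch optimizer like L-BFGS would instead need the full `(X, y)` in every call, which is exactly why it cannot back a `partial_fit()` API.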

Re: [Scikit-learn-general] Adding Sparse Autoencoder to Scikit

2013-06-26 Thread Olivier Grisel
2013/6/26 Issam :
> Hi @Hannes, how about using scipy.optimize.fmin_l_bfgs_b for optimizing
> the weights? I found it to be very efficient and fast (I even found it
> to be faster than MATLAB's minFunc); it's also widely used for neural
> network-style optimization, as in Prof. Andrew's courses […]

Re: [Scikit-learn-general] Adding Sparse Autoencoder to Scikit

2013-06-26 Thread Issam
Hi @Hannes, how about using scipy.optimize.fmin_l_bfgs_b for optimizing the weights? I found it to be very efficient and fast (I even found it to be faster than MATLAB's minFunc); it's also widely used for neural-network-style optimization, as in Prof. Andrew's courses and even in Deep Learning […]
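For reference, a minimal self-contained sketch of how fmin_l_bfgs_b is typically called with an explicit loss/gradient pair. The linear least-squares objective here is just a stand-in for a network's cost function; the optimizer only sees a callable returning (loss, gradient).

```python
import numpy as np
from scipy.optimize import fmin_l_bfgs_b

def loss_and_grad(w, X, y):
    # squared-error loss for a linear model, with its analytic gradient
    resid = X.dot(w) - y
    loss = 0.5 * np.dot(resid, resid)
    grad = X.T.dot(resid)
    return loss, grad

rng = np.random.RandomState(0)
X = rng.randn(50, 3)
true_w = np.array([1.0, -2.0, 0.5])
y = X.dot(true_w)                     # noiseless targets

# fmin_l_bfgs_b returns (optimal point, final loss, info dict)
w_opt, final_loss, info = fmin_l_bfgs_b(loss_and_grad, np.zeros(3),
                                        args=(X, y))
```

Because the objective is noiseless least squares, `w_opt` recovers `true_w` to high precision; for a real autoencoder, `loss_and_grad` would compute the reconstruction cost and its backpropagated gradient over the full batch.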

Re: [Scikit-learn-general] Adding Sparse Autoencoder to Scikit

2013-06-26 Thread Issam
Thanks! That does sound very easy; I'll get into Cython soon! I have pushed a draft version of the Sparse Autoencoder to scikit-learn's GitHub. Hopefully I have sent the pull request correctly :). Thanks!

On 6/26/2013 2:28 AM, Robert Layton wrote:
> The basics of cython are, and I'm not kidding here, quite easy to learn. […]
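For readers unfamiliar with the model under discussion, here is a rough sketch of a sparse autoencoder's cost: squared reconstruction error plus a KL-divergence penalty pushing each hidden unit's mean activation toward a small sparsity target rho. This is the generic textbook formulation, not the code in the pull request.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sparse_autoencoder_cost(X, W1, b1, W2, b2, rho=0.05, beta=3.0):
    """Generic sparse-autoencoder objective (illustrative only):
    mean squared reconstruction error + beta * KL(rho || rho_hat)."""
    H = sigmoid(X.dot(W1) + b1)          # hidden activations
    X_hat = sigmoid(H.dot(W2) + b2)      # reconstruction of the input
    recon = 0.5 * np.mean(np.sum((X_hat - X) ** 2, axis=1))
    rho_hat = H.mean(axis=0)             # mean activation per hidden unit
    kl = np.sum(rho * np.log(rho / rho_hat)
                + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))
    return recon + beta * kl

rng = np.random.RandomState(0)
X = rng.rand(10, 4)                      # inputs in [0, 1] to match sigmoid output
W1 = 0.1 * rng.randn(4, 3); b1 = np.zeros(3)
W2 = 0.1 * rng.randn(3, 4); b2 = np.zeros(4)
cost = sparse_autoencoder_cost(X, W1, b1, W2, b2)
```

With small random weights the hidden activations sit near 0.5, so the KL penalty dominates until training drives the mean activations toward rho.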

Re: [Scikit-learn-general] Adding Sparse Autoencoder to Scikit

2013-06-26 Thread Hannes Schulz
Before things diverge completely, please also have a look at https://github.com/temporaer/scikit-learn/tree/mlperceptron and the discussions at https://github.com/larsmans/scikit-learn/pull/5, where I tried to refactor larsmans' code and the gradient descent into activity and weight layers, and […]
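A toy illustration of the layer split Hannes describes (hypothetical class names, not the code in the linked branch): weight layers hold the parameters and apply the affine map, activity layers apply the parameter-free nonlinearity, and a network's forward pass is just a fold over a list of both kinds.

```python
import numpy as np

class WeightLayer:
    """Affine map: holds the parameters (the 'weight' layer)."""
    def __init__(self, n_in, n_out, rng):
        self.W = 0.1 * rng.randn(n_in, n_out)
        self.b = np.zeros(n_out)

    def forward(self, X):
        return X.dot(self.W) + self.b

class ActivityLayer:
    """Parameter-free nonlinearity (the 'activity' layer)."""
    def forward(self, X):
        return np.tanh(X)

rng = np.random.RandomState(0)
layers = [WeightLayer(4, 8, rng), ActivityLayer(),
          WeightLayer(8, 2, rng), ActivityLayer()]

X = rng.randn(5, 4)
out = X
for layer in layers:                 # forward pass: fold over the layers
    out = layer.forward(out)
```

One appeal of this split is that backpropagation becomes a symmetric walk backwards over the same list, with each layer only responsible for its own local gradient.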

Re: [Scikit-learn-general] Adding Sparse Autoencoder to Scikit

2013-06-26 Thread Lars Buitinck
2013/6/26 Robert Layton :
> The basics of cython are, and I'm not kidding here, quite easy to learn.
> Steps:
> 1) Rename .py file to .pyc

You mean .pyx.

> 2) Put "int" in front of all object declarations that will be integers,
> "float" in front of things that are floats. (If you know java/C/C++ […]
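To make the recipe concrete, a minimal sketch of what those steps look like for a hypothetical function, before and after typing (the typed version uses memoryviews, the form Jake's posts below recommend):

```cython
# before: plain Python, in a .py file
# def dot(a, b):
#     s = 0.0
#     for i in range(len(a)):
#         s += a[i] * b[i]
#     return s

# after: same function in a .pyx file, with C type declarations
def dot(double[:] a, double[:] b):
    cdef int i
    cdef double s = 0.0
    for i in range(a.shape[0]):
        s += a[i] * b[i]
    return s
```

The loop variable and accumulator become C-level values, so the loop compiles to plain C instead of repeated Python object operations.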

Re: [Scikit-learn-general] Adding Sparse Autoencoder to Scikit

2013-06-26 Thread Peter Prettenhofer
I strongly recommend reading Jake's blog entries on Cython (memoryviews in particular) [1] and Wes' blog [2], [3]. Another great resource is the ball_tree.pyx code in sklearn/neighbors/ball_tree.pyx. When you compile the .pyx file to C using Cython, you should use the "-a" flag: it will generate an annotated HTML file that highlights, line by line, where your code still interacts with the Python interpreter […]