Re: [Scikit-learn-general] I must step off gsoc

2012-07-12 Thread David Marek
On Wed, Jul 11, 2012 at 8:22 PM, Andreas Müller wrote: > Hi David. > Congratulations on landing an internship at Google. Who are you working > with? > I am working with people from the natural language understanding team (my manager is Enrique Alfonseca). I am open-sourcing one of Google's internal librari

[Scikit-learn-general] I must step off gsoc

2012-07-11 Thread David Marek
Hi, I am sorry, but I won't be able to work on gsoc. Many things have happened since I started on my project: I now live in Switzerland and work at Google (I received the offer after I had already begun the project). I am very interested in scikit-learn, I thought it would be possible to work on

[Scikit-learn-general] Gradient checking for mlp

2012-06-20 Thread David Marek
Hi, I am using check_grad from scipy for gradient checking of my mlp implementation, and I am not sure I am using it right, because it doesn't work even on a very trivial implementation of an mlp. I have written a small implementation computing the gradient of an mlp: https://gist.github.com/2959786 I thi
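For readers unfamiliar with the utility David mentions, `scipy.optimize.check_grad` compares an analytic gradient against a finite-difference approximation and returns the RMS difference. A minimal sketch (the quadratic loss here is a toy stand-in for the MLP loss; note that `check_grad` expects the parameters as a single flat 1-D array, which is a common pitfall when checking a network with several weight matrices):

```python
import numpy as np
from scipy.optimize import check_grad

# Toy loss and its analytic gradient: f(w) = 0.5 * ||w||^2, grad = w
def loss(w):
    return 0.5 * np.dot(w, w)

def grad(w):
    return w

w0 = np.array([1.0, -2.0, 3.0])
# RMS difference between analytic gradient and finite differences;
# a correct gradient gives a value near machine precision (~1e-7)
err = check_grad(loss, grad, w0)
```

For a real MLP, the weight matrices would need to be flattened into `w0` and reshaped inside `loss`/`grad` before this check applies.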

Re: [Scikit-learn-general] multilayer perceptron questions

2012-06-19 Thread David Marek
Hi On Mon, Jun 18, 2012 at 11:43 AM, Olivier Grisel wrote: > Err, no. The paper I mentioned is even newer: > > http://arxiv.org/abs/1206.1106 > > Just to make it more explicit about the paper content, the title is: > "No More Pesky Learning Rates" and it's a method for estimating the > optimal l

Re: [Scikit-learn-general] Error function in the output layer of MLP

2012-06-06 Thread David Marek
On Wed, Jun 6, 2012 at 1:50 PM, xinfan meng wrote: > > I think these two delta_o have the same meaning. If you have "Pattern > Recognition and Machine Learning" by Bishop, you can find that Bishop uses > exactly the second formula in the back propagation algorithm. I suspect > these two formulae le

Re: [Scikit-learn-general] Error function in the output layer of MLP

2012-06-06 Thread David Marek
Hi On Wed, Jun 6, 2012 at 10:38 AM, xinfan meng wrote: > Hi, all. I post this question to the list, since it might be related to > the MLP being developed. > > I found two versions of the error function for output layer of MLP are > used in the literature. > > >1. \delta_o = (y-a) f'(z) >
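The second variant is truncated in the archive, but from the Bishop reference in the follow-up it is presumably \delta_o = (y - a). The two formulas correspond to different error functions, and for a sigmoid output with cross-entropy error the f'(z) factor cancels (sign conventions for \delta vary between texts; -dE/dz is used throughout here):

```
Sum-of-squares error, a = f(z):
  E = 1/2 (y - a)^2
  \delta_o = -dE/dz = (y - a) f'(z)                         (formula 1)

Cross-entropy error, a = \sigma(z):
  E = -[y ln a + (1 - y) ln(1 - a)]
  dE/da = -(y - a) / (a (1 - a)),   \sigma'(z) = a (1 - a)
  \delta_o = -dE/dz = -(dE/da) \sigma'(z) = (y - a)         (formula 2)
```

So the formulas are not interchangeable in general: formula 2 is the special case where the output nonlinearity is matched to the error function (sigmoid with cross-entropy, or softmax with multiclass cross-entropy), which is the pairing Bishop uses.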

Re: [Scikit-learn-general] Contributing to scikit-learn

2012-06-05 Thread David Marek
Hi, As Gael and Olivier said, I am working on mlp this summer, it's my GSOC project. So there is some existing code (in Cython) and you won't be able to just use your class project, but you should definitely look at it. I will be grateful for every help and suggestion. I have got basic classificat

Re: [Scikit-learn-general] GSoC progress reports on the mailing list

2012-06-02 Thread David Marek
Hi Olivier, I had missed this email and only read it now. I have been working on my project and I have got a basic implementation running. I have published some info about my current progress on my blog http://www.davidmarek.cz/2012/06/03/mlp-progress/ I tried to give as much information to lazy reader

Re: [Scikit-learn-general] multilayer perceptron questions

2012-05-31 Thread David Marek
> Hey David. > How is it going? > I haven't heard from you in a while. > Did you blog anything about your progress? > > Cheers, > Andy > > Am 16.05.2012 12:15, schrieb David Marek: > > On Tue, May 15, 2012 at 4:59 PM, David Warde-Farley > > wr

Re: [Scikit-learn-general] multilayer perceptron questions

2012-05-16 Thread David Marek
On Tue, May 15, 2012 at 10:31 PM, Andreas Mueller wrote: > On the same topic: I'm not sure if we decided whether we want minibatch, > batch and online learning. > I have the feeling that it might be possible to do particular > optimizations for online learning, and this > is the algorithm that I f
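The three regimes Andreas distinguishes differ only in how many samples feed each weight update. A minimal sketch on a toy linear model (standing in for the MLP; all names here are illustrative, not scikit-learn API):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true                      # noiseless targets for the toy problem

w = np.zeros(3)
lr = 0.1
# batch_size=1 gives online SGD, batch_size=len(X) gives full batch,
# anything in between is minibatch -- the update rule is identical
batch_size = 10

for epoch in range(200):
    perm = rng.permutation(len(X))  # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        # gradient of the mean squared loss over the current (mini)batch
        g = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
        w -= lr * g
```

The online case (batch_size=1) is where per-sample optimizations like sparse updates pay off, which is presumably the specialization being discussed.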

Re: [Scikit-learn-general] multilayer perceptron questions

2012-05-16 Thread David Marek
Hi Yes, I did. I am using Gmail, so I just quoted one mail; I didn't want to answer each mail separately when they are so similar. Sorry, I will try to be more specific in quoting. David 16. 5. 2012 v 12:22, Andreas Mueller : > Hi David. > Did you also see this mail: > http://permalink.gmane.org/gm

Re: [Scikit-learn-general] multilayer perceptron questions

2012-05-16 Thread David Marek
On Tue, May 15, 2012 at 4:59 PM, David Warde-Farley wrote: > On Tue, May 15, 2012 at 12:12:34AM +0200, David Marek wrote: >> Hi, >> >> I have worked on multilayer perceptron and I've got a basic >> implementation working. You can see it at >> https://g

[Scikit-learn-general] multilayer perceptron questions

2012-05-14 Thread David Marek
Hi, I have worked on multilayer perceptron and I've got a basic implementation working. You can see it at https://github.com/davidmarek/scikit-learn/tree/gsoc_mlp The most important part is the sgd implementation, which can be found here https://github.com/davidmarek/scikit-learn/blob/gsoc_mlp/skl

Re: [Scikit-learn-general] GSOC 12' 3/3 !!!

2012-04-23 Thread David Marek
Awesome! Congratulations to Immanuel and Vlad, and huge thanks to everyone who helped us with our proposals. I am looking forward to working with you. David 23. 4. 2012 v 23:15, Alexandre Gramfort : > hi sklearners, > > I am happy to announce that the scikit-learn got 3 GSOC slots this year > so

Re: [Scikit-learn-general] gsoc application MLP

2012-04-05 Thread David Marek
Hi Vlad > Like Gael said in the other thread, try to submit your proposal well before > the deadline. You can still edit it on their site. I have already submitted it. > I agree with everybody regarding the importance of testing and examples. They > are not afterthoughts. The documentation, t

Re: [Scikit-learn-general] gsoc application MLP

2012-04-05 Thread David Marek
Hi On Fri, Mar 30, 2012 at 7:41 AM, Gael Varoquaux wrote: > David, I saw in another of your mails that you have more ideas of what > could be done. I think that it would be useful to enrich your > application. I have incorporated all my ideas into the proposal. There are now a few more learning alg

Re: [Scikit-learn-general] compiling cython files in scikit-learn

2012-03-31 Thread David Marek
On Sat, Mar 31, 2012 at 3:22 PM, Immanuel B wrote: > Thanks both of you, >> Do `make inplace` for the incremental build only of the C files that >> have changed since the last build and then use `nosetests >> sklearn/mypackage/module` to launch the tests only on your module. > this did the trick.
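The workflow from the reply condenses to two commands (the module path below is the placeholder from the email, not a real package; substitute whichever module you are editing):

```shell
# Incremental in-place build: regenerates and compiles only the
# C files whose .pyx sources changed since the last build
make inplace

# Run only the tests for the module you are working on
nosetests sklearn/mypackage/module
```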

Re: [Scikit-learn-general] compiling cython files in scikit-learn

2012-03-31 Thread David Marek
Hi On Sat, Mar 31, 2012 at 2:27 PM, Immanuel B wrote: > I'm just starting to work on some cython files in scikit. It would be > great if someone could suggest > an easy way to compile them. > Currently I'm running `cython` on the file and then make on > scikit-learn. This seems to work but the se

Re: [Scikit-learn-general] gsoc application MLP

2012-03-29 Thread David Marek
Hi > I'd emphasize that "SGD" is a class of algorithms, and the implementations > that exist are purely for the linear classifier setting. I'm not sure how > much use they will be in an SGD-for-MLP (they can maybe be reused for certain > kinds of output layers), but there is definitely more work i

[Scikit-learn-general] gsoc application MLP

2012-03-29 Thread David Marek
Hi, I have created a first draft of my application for gsoc. I summarized all ideas from last thread so I hope it makes sense. You can read it at https://docs.google.com/document/d/11zxSbsGwevd49JIqAiNz4Qb6cFYzHdJgH9ROegY9-qo/edit I would like to ask Andreas and David to have a look. Every feedba

Re: [Scikit-learn-general] scikit-learn gsoc idea: Neural Networks

2012-03-20 Thread David Marek
Hi I think I was a little confused, I'll try to summarize what I understand is needed: * the goal is to have multilayer perceptron with stochastic gradient descent and maybe other learning algorithms * basic implementation of sgd already exists * existing implementation is missing some loss-funct
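Since no code excerpt survives in this thread, here is a hedged, minimal sketch of the kind of network the summary describes (one hidden layer, online SGD, cross-entropy output); it is illustrative only, not the GSoC implementation or scikit-learn API. XOR is used because it forces the hidden layer to do real work:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: not linearly separable, so a hidden layer is required
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

n_hidden = 8
W1 = rng.normal(scale=0.5, size=(2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=n_hidden)
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(5000):
    for i in rng.permutation(4):       # online SGD: one sample at a time
        h = np.tanh(X[i] @ W1 + b1)    # hidden layer
        a = sigmoid(h @ W2 + b2)       # sigmoid output
        # cross-entropy + sigmoid output: the output delta is just (a - y)
        delta_o = a - y[i]
        delta_h = (1 - h**2) * (delta_o * W2)   # backprop through tanh
        W2 -= lr * delta_o * h
        b2 -= lr * delta_o
        W1 -= lr * np.outer(X[i], delta_h)
        b1 -= lr * delta_h

# predictions for the four XOR inputs
pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5
```

With this seed and enough epochs the network typically learns XOR; swapping the inner loop for a minibatch or full-batch update changes only how many rows feed each gradient step.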

Re: [Scikit-learn-general] scikit-learn gsoc idea: Neural Networks

2012-03-20 Thread David Marek
On Mon, Mar 19, 2012 at 2:03 PM, Andreas wrote: > Thanks for your email. I would appreciate it if we could keep the > discussion on the list for as long as possible, > as other people might have something to say or are interested. Ok, my mistake. I wanted to get some general insight before embarr