On Wed, Jul 11, 2012 at 8:22 PM, Andreas Müller wrote:
> Hi David.
> Congratulations for landing an internship at Google. Who are you working
> with?
>
I am working with people from the natural language understanding team (my
manager is Enrique Alfonseca). I am open-sourcing one of Google's internal
librari
Hi,
I am sorry, but I won't be able to work on GSoC. Many things have happened
since I started working on my project. I now live in Switzerland and work at
Google. I received the offer after I started working on my project. I am
very interested in scikit-learn; I thought it would be possible to work on
Hi,
I am using check_grad from scipy for gradient checking of my MLP
implementation, and I am not sure I am using it right, because it doesn't
work even on a very trivial MLP. I have written a small
implementation of computing the gradient of an MLP:
https://gist.github.com/2959786 I thi
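For anyone else debugging gradients the same way, here is a minimal sketch of how scipy's check_grad is meant to be called. The loss function here is a hypothetical stand-in (it is not the code from the gist above); the point is only the calling convention: both callables take the flat parameter vector as their first argument, and check_grad returns the 2-norm of the difference between your analytic gradient and a finite-difference estimate.

```python
import numpy as np
from scipy.optimize import check_grad

# Toy objective: f(w) = 0.5 * ||w||^2, whose gradient is simply w.
# (A hypothetical stand-in for an MLP loss over flattened weights.)
def loss(w):
    return 0.5 * np.dot(w, w)

def grad(w):
    return w

w0 = np.array([1.0, -2.0, 3.0])
err = check_grad(loss, grad, w0)
# err is the 2-norm of (analytic gradient - finite-difference gradient);
# it should be tiny (around 1e-7) when the gradient is correct.
print(err)
```

A common pitfall is passing a gradient that returns a differently shaped array than the parameter vector, or forgetting that both callables must accept the same extra `*args`.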
Hi
On Mon, Jun 18, 2012 at 11:43 AM, Olivier Grisel
wrote:
> Err, no. The paper I mentioned is even newer:
>
> http://arxiv.org/abs/1206.1106
>
> Just to make it more explicit about the paper content, the title is:
> "No More Pesky Learning Rates" and it's a method for estimating the
> optimal l
On Wed, Jun 6, 2012 at 1:50 PM, xinfan meng wrote:
>
> I think these two delta_o have the same meaning. If you have "Pattern
> Recognition and Machine Learning" by Bishop, you can find that Bishop uses
> exactly the second formula in the back propagation algorithm. I suspect
> these two formulae le
Hi
On Wed, Jun 6, 2012 at 10:38 AM, xinfan meng wrote:
> Hi, all. I post this question to the list, since it might be related to
> the MLP being developed.
>
> I found two versions of the error function for output layer of MLP are
> used in the literature.
>
>
> 1. \delta_o = (y - a) f'(z)
>
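For context on why two versions of the output delta appear in the literature: in the well-known case of a sigmoid (or softmax) output unit paired with the cross-entropy loss, the f'(z) factor cancels and the delta reduces to just (a - y). A quick numeric check (a hypothetical example, not code from this thread):

```python
import numpy as np

# For a sigmoid output unit with cross-entropy loss,
# dL/dz simplifies to (a - y): the f'(z) = a(1-a) factor cancels.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(y, a):
    return -(y * np.log(a) + (1 - y) * np.log(1 - a))

z, y = 0.7, 1.0
a = sigmoid(z)

# Central finite difference of the loss with respect to z.
eps = 1e-6
numeric = (cross_entropy(y, sigmoid(z + eps))
           - cross_entropy(y, sigmoid(z - eps))) / (2 * eps)
analytic = a - y
print(abs(numeric - analytic))  # should be close to 0
```

With a squared-error loss, by contrast, the f'(z) factor does not cancel, which is where the first formula comes from.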
Hi,
As Gael and Olivier said, I am working on the MLP this summer; it's my GSoC
project. So there is some existing code (in Cython) and you won't be able
to just use your class project, but you should definitely look at it. I
will be grateful for any help and suggestions. I have got basic
classificat
Hi Olivier,
I missed this email and am only reading it now. I have been working on my
project and I have got a basic implementation running. I have published some
info about my current progress on my blog:
http://www.davidmarek.cz/2012/06/03/mlp-progress/
I tried to give as much information to the lazy reader
> Hey David.
> How is it going?
> I haven't heard from you in a while.
> Did you blog anything about your progress?
>
> Cheers,
> Andy
>
> Am 16.05.2012 12:15, schrieb David Marek:
> > On Tue, May 15, 2012 at 4:59 PM, David Warde-Farley
> > wr
On Tue, May 15, 2012 at 10:31 PM, Andreas Mueller
wrote:
> On the same topic: I'm not sure if we decided whether we want minibatch,
> batch and online learning.
> I have the feeling that it might be possible to do particular
> optimizations for online learning, and this
> is the algorithm that I f
Hi
Yes, I did. I am using Gmail, so I just quoted one mail; I didn't want to
answer each mail separately when they are so similar. Sorry, I will
try to be more specific in my quoting.
David
16. 5. 2012 v 12:22, Andreas Mueller :
> Hi David.
> Did you also see this mail:
> http://permalink.gmane.org/gm
On Tue, May 15, 2012 at 4:59 PM, David Warde-Farley
wrote:
> On Tue, May 15, 2012 at 12:12:34AM +0200, David Marek wrote:
>> Hi,
>>
>> I have worked on multilayer perceptron and I've got a basic
>> implementation working. You can see it at
>> https://g
Hi,
I have been working on the multilayer perceptron and I've got a basic
implementation working. You can see it at
https://github.com/davidmarek/scikit-learn/tree/gsoc_mlp The most
important part is the sgd implementation, which can be found here
https://github.com/davidmarek/scikit-learn/blob/gsoc_mlp/skl
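For readers skimming the thread, a minimal sketch of what one SGD step for a one-hidden-layer MLP looks like. This is illustrative only (it is not the code from the branch above): squared-error loss, tanh hidden units, a linear output layer, and all sizes and names are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-hidden-layer MLP with small random weights.
n_in, n_hidden, n_out = 4, 5, 3
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))

def sgd_step(x, y, lr=0.1):
    """One stochastic gradient descent update on a single example."""
    global W1, W2
    # Forward pass.
    h = np.tanh(x @ W1)          # hidden activations
    out = h @ W2                 # linear output layer
    # Backward pass for squared error: L = 0.5 * ||out - y||^2.
    delta_out = out - y
    delta_h = (1 - h ** 2) * (delta_out @ W2.T)  # tanh'(z) = 1 - tanh(z)^2
    # Parameter updates.
    W2 -= lr * np.outer(h, delta_out)
    W1 -= lr * np.outer(x, delta_h)
    return 0.5 * np.sum((out - y) ** 2)

x = rng.normal(size=n_in)
y = np.array([0.0, 1.0, 0.0])
losses = [sgd_step(x, y) for _ in range(50)]
# Repeated steps on a fixed example should drive the loss down.
```

In a real implementation the inner loop runs over shuffled training examples (or minibatches) per epoch, which is exactly the batch/minibatch/online distinction discussed elsewhere in this thread.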
Awesome
Congratulations to Immanuel and Vlad and huge thanks to everyone who
helped us with our proposals.
I am looking forward to working with you.
David
23. 4. 2012 v 23:15, Alexandre Gramfort :
> hi sklearners,
>
> I am happy to announce that the scikit-learn got 3 GSOC slots this year
> so
Hi Vlad
> Like Gael said in the other thread, try to submit your proposal well before
> the deadline. You can still edit it on their site.
I have already submitted it.
> I agree with everybody regarding the importance of testing and examples. They
> are not afterthoughts. The documentation, t
Hi
On Fri, Mar 30, 2012 at 7:41 AM, Gael Varoquaux
wrote:
> David, I saw in another of your mails that you have more ideas of what
> could be done. I think that it would be useful to enrich your
> application.
I have incorporated all my ideas into the proposal. There are now a few
more learning alg
On Sat, Mar 31, 2012 at 3:22 PM, Immanuel B wrote:
> Thanks both of you,
>> Do `make inplace` for the incremental build only of the C files that
>> have changed since the last build and then use `nosetests
>> sklearn/mypackage/module` to launch the tests only on your module.
> this did the trick.
Hi
On Sat, Mar 31, 2012 at 2:27 PM, Immanuel B wrote:
> I'm just starting to work on some Cython files in scikit-learn. It would
> be great if someone could suggest
> an easy way to compile them.
> Currently I'm running `cython` on the file and then make on
> scikit-learn. This seems to work but the se
Hi
> I'd emphasize that "SGD" is a class of algorithms, and the implementations
> that exist are purely for the linear classifier setting. I'm not sure how
> much use they will be in an SGD-for-MLP (they can maybe be reused for certain
> kinds of output layers), but there is definitely more work i
Hi,
I have created a first draft of my GSoC application. I summarized
all the ideas from the last thread, so I hope it makes sense. You can read it
at
https://docs.google.com/document/d/11zxSbsGwevd49JIqAiNz4Qb6cFYzHdJgH9ROegY9-qo/edit
I would like to ask Andreas and David to have a look. Every feedba
Hi
I think I was a little confused, so I'll try to summarize what I
understand is needed:
* the goal is to have a multilayer perceptron with stochastic gradient
descent and maybe other learning algorithms
* a basic implementation of SGD already exists
* the existing implementation is missing some loss-funct
On Mon, Mar 19, 2012 at 2:03 PM, Andreas wrote:
> Thanks for your email. I would appreciate it if we could keep the
> discussion on the list for as long as possible,
> as other people might have something to say or are interested.
Ok, my mistake. I wanted to get some general insight before
embarr