On Apr 6, 2012, at 10:19 , Andreas Mueller wrote:
> On 04/06/2012 08:04 AM, xinfan meng wrote:
>> On Fri, Apr 6, 2012 at 1:57 PM, David Warde-Farley wrote:
>> On 2012-04-05, at 5:17 PM, Vlad Niculae wrote:
>> >
>> > http://ufldl.stanford.edu/wiki/images/8/84/SelfTaughtFeatures.png
On 2012-04-05, at 5:17 PM, Vlad Niculae wrote:
>
> http://ufldl.stanford.edu/wiki/images/8/84/SelfTaughtFeatures.png
>
> It is easy to set up the skeleton of such an example and when the
> implementation is good it will magically run :)
I have stared at way too many MNIST weight visualizations.
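For anyone who wants to reproduce that kind of weight image today, here is a minimal sketch using scikit-learn's MLPClassifier (which did not exist at the time of this thread; it landed in 0.18). It trains a small net on the 8x8 digits dataset and reshapes each hidden unit's incoming weights into an image; the hyperparameters are arbitrary, not a recommendation.

```python
# Sketch: visualize first-layer MLP weights on the digits dataset,
# in the spirit of the UFLDL SelfTaughtFeatures image.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X = X / 16.0  # digits pixel values are 0..16; scale to [0, 1]

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=50, random_state=0)
clf.fit(X, y)

# coefs_[0] has shape (n_features, n_hidden); each column is one hidden
# unit's incoming weights, reshaped back to the 8x8 digit image grid.
weight_images = clf.coefs_[0].T.reshape(16, 8, 8)
print(weight_images.shape)  # (16, 8, 8)
```

Each of the 16 slices can then be shown with `matplotlib.pyplot.matshow` to get a grid of learned features.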
Actually I couldn't find the code but I found something better, the assignment
notes:
https://github.com/SaveTheRbtz/ml-class/blob/master/ex4.pdf
If you ran more iterations it would only get better. Looking back, this was a
very good class.
Vlad
On Apr 6, 2012, at 06:54 , Vlad Niculae wrote:
On Apr 6, 2012, at 02:56 , Andreas Mueller wrote:
> On 04/05/2012 11:17 PM, Vlad Niculae wrote:
>> I would like to see a reproduction of the standard neural net digits example:
>>
>> http://ufldl.stanford.edu/wiki/images/8/84/SelfTaughtFeatures.png
>>
> That looks like the weights of an autoencoder, right?
On 04/05/2012 11:17 PM, Vlad Niculae wrote:
> I would like to see a reproduction of the standard neural net digits example:
>
> http://ufldl.stanford.edu/wiki/images/8/84/SelfTaughtFeatures.png
>
That looks like the weights of an autoencoder, right?
Autoencoders are not part of the plan as far as I know.
Hi Vlad
> Like Gael said in the other thread, try to submit your proposal quite before
> the deadline. You can still edit it on their site.
I have already submitted it.
> I agree with everybody regarding the importance of testing and examples. They
> are not afterthoughts. The documentation, though, can be left until the final
hi David,
I've just seen your application. I suggest you add dates to the different steps
so you can have clear objectives.
Alex
On Thu, Apr 5, 2012 at 10:55 PM, David Marek wrote:
> Hi
>
> On Fri, Mar 30, 2012 at 7:41 AM, Gael Varoquaux
> wrote:
>> David, I saw in another of your mails that you have more ideas of what could be done.
Hi David,
Like Gael said in the other thread, try to submit your proposal quite before
the deadline. You can still edit it on their site.
I agree with everybody regarding the importance of testing and examples. They
are not afterthoughts. The documentation, though, can be left until the final
Hi
On Fri, Mar 30, 2012 at 7:41 AM, Gael Varoquaux
wrote:
> David, I saw in another of your mails that you have more ideas of what
> could be done. I think that it would be useful to enrich your
> application.
I have incorporated all my ideas into the proposal. There are now a few
more learning algorithms.
On Thu, Mar 29, 2012 at 09:58:22PM +0200, Andreas wrote:
> > Otherwise, it seems like a good proposal. As I said, it seems like a rather
> > small amount of actual implementation, even if you are only budgeting the
> > first half of the work period. I would look for some additional features to
> >
Hi
> I'd emphasize that "SGD" is a class of algorithms, and the implementations
> that exist are purely for the linear classifier setting. I'm not sure how
> much use they will be in an SGD-for-MLP (they can maybe be reused for certain
> kinds of output layers), but there is definitely more work i
> Otherwise, it seems like a good proposal. As I said, it seems like a rather
> small amount of actual implementation, even if you are only budgeting the
> first half of the work period. I would look for some additional features to
> flesh out the implementation side of the proposal.
>
I was n
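To illustrate the quoted point that the existing linear-model SGD code does not directly carry over to an MLP, here is a toy numpy sketch (not scikit-learn code) of a gradient-descent loop for a one-hidden-layer network with squared-error loss: the update now has to backpropagate through a nonlinearity, which the linear-classifier SGD routines never need to do. Full-batch for brevity; true SGD would sample minibatches.

```python
# Toy one-hidden-layer MLP trained by gradient descent on a synthetic
# regression target, showing the backpropagation step that linear SGD lacks.
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(32, 5)
y = np.tanh(X @ rng.randn(5))  # synthetic target

n_hidden = 8
W1 = rng.randn(5, n_hidden) * 0.1   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.randn(n_hidden) * 0.1      # hidden -> output weights
b2 = 0.0
lr = 0.1

losses = []
for epoch in range(200):
    h = np.tanh(X @ W1 + b1)        # hidden activations
    pred = h @ W2 + b2              # linear output layer
    err = pred - y
    losses.append(0.5 * np.mean(err ** 2))

    # Backpropagation: output-layer gradients, then push the error
    # through the tanh derivative (1 - h^2) to the first layer.
    gW2 = h.T @ err / len(X)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)

    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(losses[0] > losses[-1])  # loss decreases over training
```

The output-layer update (`gW2`, `gb2`) is essentially the same shape as linear-model SGD, which is the part the quoted message suggests might be reusable; everything involving `dh` is new.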
On Thu, Mar 29, 2012 at 06:19:48PM +0200, David Marek wrote:
> Hi,
>
> I have created a first draft of my application for gsoc. I summarized
> all ideas from last thread so I hope it makes sense. You can read it
> at
https://docs.google.com/document/d/11zxSbsGwevd49JIqAiNz4Qb6cFYzHdJgH9ROegY9-qo/edit
Hi David.
Content wise this looks reasonable, I think.
For the "must haves" I would add the elastic net penalty
and the "Crammer-Singer loss", aka "multi-class hinge" (maybe
sometimes also called rank loss?). Both shouldn't be very hard.
For the nice-to-haves you could also add RPROP (resilient
backpropagation).
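For concreteness, a hedged sketch of what the suggested elastic net penalty could look like inside an SGD update: the penalty gradient mixes an L1 (sign) term and an L2 term, as in scikit-learn's `SGDClassifier(penalty="elasticnet")`. The plain subgradient step below is illustrative only; real implementations use truncation tricks to produce exact zeros.

```python
# Illustrative SGD step with an elastic net penalty subgradient.
import numpy as np

def elastic_net_sgd_step(w, grad_loss, lr=0.01, alpha=1e-4, l1_ratio=0.15):
    """One SGD step; alpha scales the penalty, l1_ratio mixes L1 vs L2."""
    penalty_grad = alpha * (l1_ratio * np.sign(w) + (1 - l1_ratio) * w)
    return w - lr * (grad_loss + penalty_grad)

w = np.array([0.5, -0.3, 0.0])
w_new = elastic_net_sgd_step(w, grad_loss=np.zeros(3))
print(w_new)  # each nonzero weight shrinks slightly toward zero
```

With a zero loss gradient the penalty alone shrinks the nonzero weights, while a weight already at zero stays there (sign(0) = 0 in this subgradient).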
Hi,
I have created a first draft of my application for gsoc. I summarized
all ideas from last thread so I hope it makes sense. You can read it
at
https://docs.google.com/document/d/11zxSbsGwevd49JIqAiNz4Qb6cFYzHdJgH9ROegY9-qo/edit
I would like to ask Andreas and David to have a look. Every feedback is welcome.