Hi

I've gone through the paper on efficient backpropagation that was
provided on the ideas page. I could follow the paper only up to
section 5 (Convergence of Gradient Descent), because that is about as
far as most tutorials on neural networks go.

My main question is whether the literature provided on the ideas page
(for this project and for the other projects as well) is just a
pointer, or whether candidates are expected to code explicitly from
the literature. If it is the latter, we will have to spend some time
studying the material in depth, which will be difficult since we will
be expected to spend most of our time coding during the summer. Or are
we supposed to familiarize ourselves with the material before the
application deadline? If that is the case, I'd like to know how much
expertise this particular project requires with respect to the said
paper.

I'm going to apply for this project under GSoC, and here are my pros and cons.

Pros: I'm taking a course in neural networks this semester, so this
particular project will have minimal overhead for me. Although I'm
only officially taking the course now, in my final undergrad year,
I've been using machine learning techniques for a long time. I'm
familiar with MLPs, clustering and the use of linear transformations
for feature extraction. I'm somewhat obsessive about having a very
clear understanding of these tools. I tend to write everything from
scratch and use available libraries only when I'm sure I know what
they do. For instance, before I use most estimators in sklearn, I
write my own code, test it on small datasets and make sure it produces
outputs similar to those of the sklearn estimators. (Of course, this
notion of similarity is subjective; I don't know how to establish
*rigorously* that my results match those of the sklearn estimators.
For most purposes I rely on the least-squares error.) In fact, I was
somewhat surprised that sklearn doesn't begin with neural networks.
Then I found out that sklearn is meant for applying machine learning
directly, without too much theory.
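To make the workflow above concrete, here is a minimal sketch of the kind of sanity check I mean: fit a hand-rolled ordinary least squares estimator and compare its predictions against sklearn's LinearRegression on a small synthetic dataset, using mean squared difference as the (admittedly informal) similarity measure. The helper names `fit_ols` and `predict_ols` are just illustrative, not from any existing project.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def fit_ols(X, y):
    """Ordinary least squares via numpy's lstsq, with an explicit bias column."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend a column of ones
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict_ols(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return Xb @ w

# Small synthetic regression problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.3 + rng.normal(scale=0.01, size=50)

w = fit_ols(X, y)
sk = LinearRegression().fit(X, y)

# "Similarity" measured as the mean squared difference between predictions.
mse = np.mean((predict_ols(w, X) - sk.predict(X)) ** 2)
print(mse)
```

On a well-conditioned problem like this, both solvers minimise the same least-squares objective, so the mean squared difference should be numerically tiny.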

Cons: I've never actively contributed to an open source library, as
I've only recently become acquainted with the concept of 'community
coding'. I discovered the PEP 8 style guide only a couple of months
ago. I've learnt OOP very recently too, but now that I know it, I've
started thinking in terms of classes and their instances almost
everywhere in my code (is that weird?). In fact, my style of coding
has been called 'too MATLABish' :) Basically, all my education in
open-source coding has happened since SciPy India 2011, held last
December.

In summary, I'm not very confident about how the community will feel
about my code, but I do make sure it does its job. And I can always
learn the soft skills required for contributing to the community.

Here's sample code for a Perceptron class that I wrote from scratch
when I learnt OOP: https://gist.github.com/2188832
Please go through it and be as critical as possible.
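For context (this is not the code in the gist above, just a sketch of the shape I aimed for after learning OOP), a minimal Perceptron class with the classic update rule might look like this; the class and method names are illustrative:

```python
import numpy as np

class Perceptron:
    """Classic Rosenblatt perceptron for labels in {-1, +1}."""

    def __init__(self, n_features, lr=1.0):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict(self, X):
        # Sign of the affine score decides the class.
        return np.where(X @ self.w + self.b >= 0, 1, -1)

    def fit(self, X, y, epochs=10):
        # Perceptron rule: update only on misclassified points.
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (xi @ self.w + self.b) <= 0:
                    self.w += self.lr * yi * xi
                    self.b += self.lr * yi
        return self

# Linearly separable toy data: label follows the sign of the first feature.
X = np.array([[1.0, 0.5], [2.0, -1.0], [-1.5, 0.3], [-0.5, -2.0]])
y = np.array([1, 1, -1, -1])
clf = Perceptron(n_features=2).fit(X, y)
print(clf.predict(X))
```

On linearly separable data like this, the perceptron convergence theorem guarantees the rule finds a separating hyperplane in finitely many updates.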

So does this warrant a GSoC application?

Cheers,

_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
