On Wed, Mar 15, 2017 at 11:13:12PM +0530, abhinav kannan wrote:
> Greetings, developers!
> 
> First and foremost, hearty congratulations to the mlpack team on being
> accepted into GSoC '17. I am honored to begin interacting with this
> hardworking and enthusiastic group.
> 
> I am Abhinav Kannan, a second-year Computer Science student at SRM
> University, Chennai, India. I am proficient in C++ and Python, with over
> three years of experience in the former, and I am passionate about machine
> learning. I am an enthusiastic, fast and committed learner, with a knack
> for picking up concepts and solving problems on my own.
> 
> I have nearly completed Andrew Ng's machine learning course from Stanford
> University on Coursera (certificate due in two weeks!), and since the
> course has been a worthwhile challenge, I am looking to take on a bigger
> and tougher, yet achievable, project over the summer. The project
> "Parallel stochastic optimization methods" is one I am motivated to work
> on. I am currently installing mlpack on Ubuntu. Since working through the
> installation by myself has been taking some time, I've decided to reach
> out here before it's too late. In the meantime, I have done some
> background research on the topics related to the project that are new to
> me, such as SCD. Now that I have a basic idea of it, I have also noticed
> the limitations of this technique, described here
> <https://en.wikipedia.org/wiki/Coordinate_descent#Limitations>[1] and here
> <http://stats.stackexchange.com/questions/146317/coordinate-vs-gradient-descent>
> [2].
> 
> With respect to this project, Prof. Ng's course gives an in-depth
> treatment of optimization algorithms such as gradient descent, applied to
> convex objectives in linear regression and logistic regression. The course
> also covers stochastic (and batch) gradient descent, neural networks,
> map-reduce and data parallelism. These topics are tested in the course's
> programming assignments and quizzes, which I have completed, so I am
> confident in my understanding of them.
> 
> I am also getting started with multi-threaded programming as part of my
> coursework at university. I would love to begin work on this project soon
> (if not right away!), with some input from the developers here. I feel
> that contributing to mlpack will not only be an honor, but also a
> challenging, hands-on and memorable experience that will deepen my
> knowledge and help me become part of the open source community.
> 
> Looking forward to an early response.

Hi Abhinav,

Thanks for getting in touch.  It sounds like you have already done a
good amount of searching, but just in case, here are two useful pages:

http://www.mlpack.org/gsoc.html
http://www.mlpack.org/involved.html

Those could be helpful for making your first contributions to the
library.  For the parallel stochastic optimization methods project, you
may want to take a look at the mailing list archives.  Here's one post
about it that describes how open-ended the project is:

https://mailman.cc.gatech.edu/pipermail/mlpack/2016-March/000752.html
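
Since you mention SCD is new to you, here is a rough, untested sketch of
what a plain (non-stochastic, single-threaded) coordinate descent pass
looks like on a simple quadratic objective, written against Armadillo since
that is what mlpack builds on.  It is only meant to illustrate the basic
idea, not any actual mlpack optimizer API; the project itself is about the
stochastic and parallel versions of this kind of update.

  #include <armadillo>

  // Cyclic coordinate descent for f(x) = 0.5 * x' A x - b' x, assuming A is
  // symmetric positive definite (so every A(i, i) > 0).  Each inner step
  // exactly minimizes f over one coordinate while the others are held fixed.
  arma::vec CoordinateDescent(const arma::mat& A,
                              const arma::vec& b,
                              const size_t sweeps)
  {
    arma::vec x(b.n_elem, arma::fill::zeros);
    for (size_t s = 0; s < sweeps; ++s)
    {
      for (size_t i = 0; i < x.n_elem; ++i)
      {
        // Setting df/dx_i = 0 gives
        // x_i = (b_i - sum_{j != i} A(i, j) x_j) / A(i, i).
        const double r = b(i) - arma::dot(A.row(i), x) + A(i, i) * x(i);
        x(i) = r / A(i, i);
      }
    }
    return x;
  }

A stochastic variant would pick the coordinate to update at random instead
of cycling through them, and the parallel question is which coordinates can
safely be updated at the same time.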

Thanks,

Ryan

-- 
Ryan Curtin    | "Sometimes, I doubt your commitment to Sparkle
[email protected] | Motion!"  - Kitty Farmer
_______________________________________________
mlpack mailing list
[email protected]
http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
