Re: [mlpack] Queries regarding GSoC

2018-02-13 Thread Marcus Edel
Hello Rajiv,

> I initially had 3 topics of choice (in the previous mail), but after going
> through the mailing list archive, mlpack blog, and other links provided by
> you, I have decided to go ahead with "ESSENTIAL DEEP LEARNING MODULES". I
> like the list of suggested algorithms. I'll also see if I can think of
> anything. Please let me know if you have any suggestions in mind.

Sounds good; we will update the project description in the coming days and
probably add more suggestions.

> So, how should I go about it from here? Is there any open ticket to work on?
> Should I start thinking about the proposal?

If you don't mind, I'd like to finish the open PR first before opening another
one; how does that sound? As for the proposal, you can start working on it if
you like; the most important part is the timeline.

I hope this is helpful.

Thanks,
Marcus

> On 13. Feb 2018, at 08:07, Rajiv Vaidyanathan wrote:
> 
> Hi Marcus,
> 
> Firstly, congratulations on making it into GSoC 2018!
> 
> I initially had 3 topics of choice (in the previous mail), but after going
> through the mailing list archive, mlpack blog, and other links provided by
> you, I have decided to go ahead with "ESSENTIAL DEEP LEARNING MODULES". I
> like the list of suggested algorithms. I'll also see if I can think of
> anything. Please let me know if you have any suggestions in mind.
> 
> So, how should I go about it from here? Is there any open ticket to work on?
> Should I start thinking about the proposal?
> 
> Also, I thought I should finish the SPSA optimizer
> (https://github.com/mlpack/mlpack/pull/1153) before I start working on a new
> issue. I am stuck on the implementation of the test code. I have posted the
> latest error in the comments. It would be great if you could help me out.
> 
> Regards,
> Rajiv
> 
> On 3 February 2018 at 20:37, Rajiv Vaidyanathan wrote:
> 
> 
> Hi Marcus,
> 
> As of now, I am working on writing a test for the SPSA optimizer which I have
> implemented... I'll try to finish it ASAP. At the moment, I cannot think of
> any DL model good enough to replace the suggested ones... Also, as you said,
> let's wait until Google officially confirms :)
> 
> Regards,
> Rajiv
> 
> On 3 February 2018 at 03:22, Marcus Edel wrote:
> Hello Rajiv,
> 
> Nice to hear from you again; how are things going?
> 
> > I am interested in the following topics (listed in the order of interest):
> > 1. Reinforcement Learning
> > 2. Essential Deep Learning Modules
> > 3. Particle Swarm Optimization
> >
> > How should I go about it? I read a few mailing list archives for 1 and 3.
> > What should I do after that? Can I start working on the proposal submission?
> 
> 
> Going through the mailing list archive is definitely a good starting point;
> the weekly updates from Kris and Shangtong
> (http://www.mlpack.org/gsocblog/index.html) could be interesting too.
> 
> The models listed for the Essential Deep Learning Modules idea are just
> suggestions; if you would like to work on an interesting network model over
> the summer, please feel free to start a discussion.
> 
> A good starting point, in general, is to get familiar with the codebase; if
> you have any questions, please don't hesitate to ask. As for the submission,
> that's up to you, but note that Google hasn't announced the accepted
> organizations yet and there is plenty of time to prepare the proposal and get
> feedback.
> 
> I hope this is helpful, let us know if we should clarify anything.
> 
> Thanks,
> Marcus
> 
> > On 2. Feb 2018, at 16:19, Rajiv Vaidyanathan wrote:
> >
> > Hi Marcus,
> >
> > I'm N Rajiv Vaidyanathan (GitHub handle: rajiv2605). I have contributed to
> > mlpack in the past and I really like this organisation. I am very
> > interested in participating in GSoC.
> >
> > I am interested in the following topics (listed in the order of interest):
> > 1. Reinforcement Learning
> > 2. Essential Deep Learning Modules
> > 3. Particle Swarm Optimization
> >
> > How should I go about it? I read a few mailing list archives for 1 and 3.
> > What should I do after that? Can I start working on the proposal submission?
> >
> > Thanking you.
> >
> > Regards,
> > Rajiv
> >
> 
> 
> 
> 
> ___
> mlpack mailing list
> mlpack@lists.mlpack.org
> http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack


Re: [mlpack] Implementation of matlab ksdensity function

2018-02-13 Thread Ryan Curtin
On Tue, Feb 13, 2018 at 01:30:42PM +, Angelo DI SENA wrote:
> Hi
> 
> I'm trying to "translate" a complex Matlab script into C++ using Armadillo.
> I implemented some functions that are not present in Armadillo, but I'm
> currently blocked with ksdensity. In particular, I need to use ksdensity
> with the following parameters:
> 
> [ky kx] = ksdensity(vector,pts,'function','cdf');
> 
> for a probability density estimate.
> 
> Is there any implementation of such a function using mlpack/Armadillo?

Hi Angelo,

It has been a long-standing open issue to implement fast tree-based
kernel density estimation: https://github.com/mlpack/mlpack/issues/150
But unfortunately there is no implementation available today.

However, if you are just looking for something simple (if vector or pts
is not too big), you can write a loop over all of the points in 'pts' to
calculate the normal distribution value.  There may be a little bit of
complexity involved---I don't know if ksdensity() does auto-tuning of
the bandwidth to be used, etc. (I have not used it before).
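
Just as a sketch (this isn't something mlpack provides out of the box, and the
bandwidth below is a guess at Silverman's rule of thumb rather than whatever
ksdensity() auto-selects), that loop could look roughly like this with
Armadillo:

#include <cmath>
#include <armadillo>

// Rough sketch only: estimate the kernel CDF at each query point in 'pts' by
// averaging the Gaussian kernel CDF centered at every sample in 'data'.
arma::vec KernelCDFEstimate(const arma::vec& data, const arma::vec& pts)
{
  const double n = (double) data.n_elem;
  const double h = 1.06 * arma::stddev(data) * std::pow(n, -0.2); // bandwidth

  arma::vec result(pts.n_elem, arma::fill::zeros);
  for (size_t i = 0; i < pts.n_elem; ++i)
  {
    for (size_t j = 0; j < data.n_elem; ++j)
    {
      // Standard normal CDF of the scaled distance between query and sample.
      result[i] += 0.5 * (1.0 + std::erf((pts[i] - data[j]) / (h * std::sqrt(2.0))));
    }
    result[i] /= n;
  }
  return result;
}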

One class that may help in mlpack is the GaussianDistribution class in
src/mlpack/core/dists/gaussian_distribution.hpp.
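
If it helps, a tiny hedged sketch using that class (note it gives the density,
not the CDF that the 'cdf' option of ksdensity() returns) could be:

#include <iostream>
#include <mlpack/core.hpp>

int main()
{
  // A univariate standard normal (mean 0, variance 1); Probability() evaluates
  // the Gaussian density at the given point.
  mlpack::distribution::GaussianDistribution g(arma::vec("0.0"), arma::mat("1.0"));
  std::cout << "pdf at 0.5: " << g.Probability(arma::vec("0.5")) << std::endl;
  return 0;
}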

I hope this helps; let me know if I can clarify anything.

Thanks!

Ryan

-- 
Ryan Curtin| "Oh man, I shot Marvin in the face."
r...@ratml.org |   - Vincent
___
mlpack mailing list
mlpack@lists.mlpack.org
http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack

Re: [mlpack] Implementation of matlab ksdensity function

2018-02-13 Thread Angelo DI SENA
Hi Ryan

Thanks for your answer.
I partially understood your suggestion.
This is due to my poor knowledge of the math behind it.

In the Matlab script I'm trying to convert, 'vector' has 3000 elements (1x3000)
with values between -1 and 1, and 'pts' is a 200-element vector (200x1).

From the Matlab documentation, the result should be 200 pairs of values (one
for each element in pts). So, what is not clear to me is how I should treat
'vector': for each value in pts, which values from 'vector' must be considered?
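
For what it's worth, a minimal sketch of how this could fit together, assuming
the hypothetical KernelCDFEstimate helper sketched in the earlier reply (every
query point in pts would use all 3000 samples in vector):

arma::vec data = 2.0 * arma::randu<arma::vec>(3000) - 1.0; // 3000 samples in [-1, 1]
arma::vec pts = arma::linspace(-1.0, 1.0, 200);            // 200 query points

// One estimate per query point, so ky would correspond to 'cdf' and kx to
// 'pts', if the Matlab call is read that way.
arma::vec cdf = KernelCDFEstimate(data, pts);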

I hope you can help with that.

Angelo
 

-Original Message-
From: Ryan Curtin [mailto:r...@ratml.org] 
Sent: Tuesday, 13 February 2018 15:16
To: Angelo DI SENA 
Cc: mlpack@lists.mlpack.org
Subject: Re: [mlpack] Implementation of matlab ksdensity function

On Tue, Feb 13, 2018 at 01:30:42PM +, Angelo DI SENA wrote:
> Hi
> 
> I'm trying to "translate" a complex Matlab script into C++ using Armadillo.
> I implemented some functions that are not present in Armadillo, but I'm
> currently blocked with ksdensity. In particular, I need to use ksdensity
> with the following parameters:
> 
> [ky kx] = ksdensity(vector,pts,'function','cdf');
> 
> for a probability density estimate.
> 
> Is there any implementation of such a function using mlpack/Armadillo?

Hi Angelo,

It has been a long-standing open issue to implement fast tree-based kernel 
density estimation: https://github.com/mlpack/mlpack/issues/150
But unfortunately there is no implementation available today.

However, if you are just looking for something simple (if vector or pts is not 
too big), you can write a loop over all of the points in 'pts' to calculate the 
normal distribution value.  There may be a little bit of complexity 
involved---I don't know if ksdensity() does auto-tuning of the bandwidth to be 
used, etc. (I have not used it before).

One class that may help in mlpack is the GaussianDistribution class in 
src/mlpack/core/dists/gaussian_distribution.hpp.

I hope this helps; let me know if I can clarify anything.

Thanks!

Ryan

-- 
Ryan Curtin| "Oh man, I shot Marvin in the face."
r...@ratml.org |   - Vincent
___
mlpack mailing list
mlpack@lists.mlpack.org
http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack

[mlpack] Regarding

2018-02-13 Thread Akash Shivram
Hey there!
Congratulations on getting into GSoC '18!!

I was going through the organisations participating this year, searching for
organisations working in ML- and DL-related fields. I came across mlpack and
was delighted to see a project on RL!! I like RL and wanted a project to do in
this field.
I have experience working with Neural Networks, Reinforcement Learning, and
Deep Q-Learning. As this is my first day with your repository, I have gone
through the requirements for an applicant to the 'Reinforcement Learning'
project and am trying to go through as many of the listed papers as possible.
Are there any more 'bonus' papers, or anything extra that would be required?
Moreover, I am thinking of working on my application as early as this week. Is
that OK? I am going through the codebase, and as I find something to talk
about, can I trouble you people with my questions? There might be a lot, some
even stupid!

Thank you

PS: This mail went too long!! Sorry for the long read!
___
mlpack mailing list
mlpack@lists.mlpack.org
http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack