Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-20 Thread Chintan Soni
Hello Marcus,

> Do you mean something like "COPSO: Constrained Optimization via PSO
> algorithm"
> by A. H. Aguirre et al.?

I was actually referring to constraint handling techniques, such as
those covered in Section 3 of the paper you mentioned (I guess my
question was poorly phrased, my bad). That said, this paper actually
turned out to be quite useful because:
1. It covers a few constraint handling techniques, which I can dig deeper into.
2. From the paper: "He and Prempain used a “fly-back” mechanism that
returns an unfeasible particle to its previous feasible position [23].
An important drawback of this technique is the requirement of an
all-feasible initial population."

This fly-back mechanism is exactly what the MOPSO paper suggested for
handling constraints, so I may also have to look into improvements on
it.
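
To make sure I understand it correctly, here is a minimal sketch of the
fly-back idea in C++ (IsFeasible() is a hypothetical helper of my own,
not mlpack API):

    // If the updated position violates any constraint, the particle
    // returns ("flies back") to its previous feasible position.
    arma::vec candidate = position + velocity;
    if (IsFeasible(candidate))
      position = candidate;
    // else: keep the previous feasible position.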

Thanks for the help!

Regards,
Chintan

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-20 Thread Marcus Edel
Hello Chintan,

Do you mean something like "COPSO: Constrained Optimization via PSO algorithm"
by A. H. Aguirre et al.?

Best,
Marcus

> On 20. Mar 2018, at 11:08, Chintan Soni  wrote:
> 
> Hello Marcus,
> 
> I have looked into Augmented Lagrangian for constrained PSO in depth, as we
> discussed before ([1]). Could you also please provide a reference to another
> constraint evaluation method to compare with? Thanks in advance.
> 
> Regards,
> Chintan
> 
> Links used:
> [1]: https://link.springer.com/article/10.1007/s00158-006-0032-z 
> 

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-20 Thread Chintan Soni
Hello Marcus,

I have looked into Augmented Lagrangian for constrained PSO in depth, as we
discussed before ([1]). Could you also please provide a reference to another
constraint evaluation method to compare with? Thanks in advance.

Regards,
Chintan

Links used:
[1]: https://link.springer.com/article/10.1007/s00158-006-0032-z

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-17 Thread Chintan Soni
Hello Marcus,

Just submitted an initial PR about half an hour ago.
There were some minor styling errors, which were easy to fix. However,
when looking up the index of the best particle using
arma::index_min(), the code works correctly on my machine, but the
Jenkins build fails.
The error message is:

/pso/pso_impl.hpp:99:30: error: ‘index_min’ is not a member of ‘arma’
   arma::uword bestParticle = arma::index_min(particleBestFitnesses);

I double-checked for any similar errors on my machine and found none.
Is there something I'm missing?
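
In case it is useful, a fallback that I believe works on older
Armadillo releases (which may predate arma::index_min()) is the min()
member overload that reports the index; a sketch under that assumption:

    // Portable alternative: the min(index) member overload fills in
    // the index of the smallest element of the arma::vec
    // particleBestFitnesses.
    arma::uword bestParticle;
    particleBestFitnesses.min(bestParticle);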

Thanks in advance for the help.

Regards,
Chintan

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-16 Thread Chintan Soni
Hello Marcus,

> Sounds good, don't feel obligated to work on the parts before GSoC has
> started, but it might be a good opportunity to get familiar with the
> codebase, your call.

Just finished a basic implementation of the lbest variant of PSO. It was
indeed a good opportunity to get familiar with the codebase.

Tested it out on the Rastrigin and Rosenbrock functions (8 particles, 2
dimensions, 1000 iterations) and the tests passed. I will be testing more
rigorously tomorrow, adding proper comments/documentation, and will also
submit a PR.
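
For context, the core update I implemented looks roughly like the sketch
below; the variable names are my own, the constriction-style coefficients
are common defaults from the literature, and math::Random() is mlpack's
uniform [0, 1) helper:

    // lbest velocity/position update for particle i; localBest holds
    // the best position found so far in particle i's neighborhood.
    const double w = 0.729, c1 = 1.49445, c2 = 1.49445;
    velocity.col(i) = w * velocity.col(i)
        + c1 * math::Random() * (personalBest.col(i) - position.col(i))
        + c2 * math::Random() * (localBest.col(i) - position.col(i));
    position.col(i) += velocity.col(i);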

A small doubt: what format are we using for documenting mathematical
formulae in the code? Is it TeX?
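
If it is Doxygen-style LaTeX, which is my guess from other optimizer
headers, the pattern would look something like this sketch:

    /**
     * Velocity update used by lbest PSO, documented with Doxygen's
     * LaTeX math delimiters (\f[ ... \f]), assuming that is the
     * convention used elsewhere in the codebase:
     *
     * \f[
     * v_i \leftarrow \omega v_i + c_1 r_1 (p_i - x_i) + c_2 r_2 (l_i - x_i)
     * \f]
     */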

I was also wondering if using OpenMP would speed up the execution of the
algorithm, since PSO is embarrassingly parallel. I am not sure if OpenBLAS
uses threading by default, so it is difficult to comment on the outcome
without running a few timing tests. I will carry those out as well, with
substantially larger parameters and more particles, to confirm the results.
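
The kind of loop I have in mind is sketched below; f, positions,
fitnesses, and numParticles are placeholders rather than existing mlpack
code, and the sketch assumes each Evaluate() call is independent (which
holds for PSO's fitness evaluations):

    // Evaluate all particle fitnesses in parallel; each iteration only
    // writes to its own slot, so the loop is safe to parallelize.
    #pragma omp parallel for
    for (size_t i = 0; i < numParticles; ++i)
      fitnesses(i) = f.Evaluate(positions.col(i));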

As for the application, it is still a WIP; should get a decent first draft
done over the weekend.

Thanks for all the help.

Regards,
Chintan.

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-14 Thread Chintan Soni
Hello Marcus,

> Really like the ideas, and I think this is something that could be
> explored, might be a neat research component.

Thanks for all the positive feedback.

> Sounds good, don't feel obligated to work on the parts before GSoC has
> started, but it might be a good opportunity to get familiar with the
> codebase, your call.

Okay, I will try to find a good balance between working on the application
and the code, with the application taking higher priority. The initial part
is ready, but there is hardly any meaningful content in the document yet;
just the first few questions answered. I'll finish up the basic skeleton
(or at least the major parts) and share it for a first review.

> No, if you have one, you could add a reference; if not, that's fine, no
> worries, these are optional pieces of information.

Glad to know that, as I don't have one yet. If time permits, I might try
making one. That's a big IF, though.

I'll post an update as soon as possible, most likely regarding the
application.

Thanks and regards,
Chintan

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-13 Thread Chintan Soni
Hello Marcus,

I went through the PR you mentioned a bit more thoroughly; I had
missed the use of the velocity update policy. It's probably the best
way to incorporate multiple variants of the algorithm.

>> Looks really interesting, the "Better initialization" point is definitely 
>> something that might affect the overall training time in a positive way.

Just had an idea regarding this; it involves slightly modifying the
way the iterations are run, as follows:

1. Randomly distribute particles over the search space (standard
initialization method)

2. The user specifies an initial point (currently what f.GetInitialPoint()
provides for the test functions). This is usually the weight matrix saved
from a run of a training algorithm.

3. We run 1 (or a very small number of) iteration(s) on the randomly
distributed particles and evaluate the fitness values of each
particle.

4. Pick the worst particle from the swarm and replace it with the
user-provided initial point (if the user's point is better than this
"worst" particle; otherwise, discard the user's point entirely).

The above steps ensure that each run of the algorithm makes use of the
user-provided initial point, and benefits from the random distribution
as well. This method can also be extended to provide a better initial
point for gradient descent (see the sketch after the remaining steps):

5. Run the standard PSO algorithm for x% of the max iterations, and at
the end of the x% iterations, evaluate the best particle. This
particle will serve as the initialization point for the gradient
descent step.

6. Run GD for the next (100-x)% iterations.

It goes without saying that the algorithm will return as soon as the
fitness reaches the tolerance threshold.
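
To make the flow concrete, here is rough pseudocode for steps 1 through 6;
every function name is a placeholder for illustration, not existing mlpack
API:

    // Hybrid PSO + GD sketch; x is the PSO share of iterations in
    // percent, and userPoint is the step-2 initial point (e.g. what
    // f.GetInitialPoint() would provide).
    InitializeRandomly(positions);                        // step 1
    EvaluateAll(f, positions, fitnesses);                 // step 3
    arma::uword worst;
    fitnesses.max(worst);                                 // worst particle
    if (f.Evaluate(userPoint) < fitnesses(worst))         // step 4
      positions.col(worst) = userPoint;                   // else discard it
    RunPSO(f, positions, (x / 100.0) * maxIterations);    // step 5
    arma::vec start = BestParticle(positions, fitnesses);
    RunGradientDescent(f, start,
        ((100 - x) / 100.0) * maxIterations);             // step 6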

The value of x depends mostly on the complexity of the objective function:
if the objective function is computationally expensive to evaluate, a
smaller value of x will favour gradient descent over PSO and still
influence training time in a positive way, whereas if the objective
function is relatively simple, PSO might converge faster. It might
help to make x configurable as a user input.

Regarding the time and resources being consumed by the build process,
I have disabled tests entirely for the time being and linked external
code against the generated library; this has brought the build time
down to a few minutes and is much more resource-friendly.

I have set up a basic layout with some files (imitating the structure
of other optimizers) and started coding; it might take some time to
have a basic working PSO as I only have a few hours each weekday and
mostly get work done over the weekend. I will submit a PR as soon as
possible.

Another doubt: the application guide mentions adding a link to a
"Homepage"; is it necessary to have one?

Please let me know your thoughts about the approach mentioned above.
Will keep you posted about my progress.

Thanks and regards,
Chintan

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-12 Thread Chintan Soni
Hello Marcus,

> That is correct, if you like to work on it, please feel free.

Yes, sure; I intend to have at least a basic PSO working before the proposal
submission deadline, as it will help me pin down the details better.

>> Glad to know that, I'll also look into the idea of using lambdas, and
>> see if I can come up with a comparison between the two.
>
> Sounds good, that might be a good point to include in the application.

Thanks, I'll keep that in mind while writing the application.

> Google recommends Google Docs, which makes it super easy to provide
> feedback.

Yeah, that's easier to handle in most cases.

I also tried disabling the Python bindings; the build process seems to be
much smoother than before. I was thinking of building the mlpack_test
executable to quickly add and run tests after building; that, however,
requires building the CLI executables, and it wouldn't hurt to bypass
those. I was thinking of manually linking against the generated library
and writing tests externally; is there another approach that is more
commonly used?

Thanks for all the help; I will keep you updated regarding my progress.
Also, should I wait for the existing PR ([1]) to be merged before adding my
code to it? Or is it better to just submit code written in a different
directory (pso_lbest)? The former might make it easier to integrate the two
approaches (gbest and lbest), but it might make adding support for GD a bit
tricky.

Thanks and regards,
Chintan

Links used:
[1] https://github.com/mlpack/mlpack/pull/1225

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-11 Thread Chintan Soni
Hello Marcus,

> There is already an open PR for the standard PSO
> https://github.com/mlpack/mlpack/pull/1225 that should be helpful.

I'll check it out thoroughly today. I went over it briefly and noticed
it involves the gbest PSO; maybe I can start with lbest PSO then?

> Agreed, that is definitely an option we can adapt for the PSO method.

Glad to know that, I'll also look into the idea of using lambdas, and
see if I can come up with a comparison between the two.

> Agreed, another FunctionType is a good idea to represent multiple functions,
> one idea is to use variadic templates to pass multiple functions, that would
> allow us to use the same interface.

Yes, variadic templates seem like the right option for now.
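
To sketch what I have in mind (a rough, hypothetical helper, assuming each
FunctionType follows mlpack's double Evaluate(const arma::mat&) convention;
none of this is existing mlpack code):

    #include <armadillo>

    // Evaluate an arbitrary number of objectives at one point via
    // parameter pack expansion; returns one fitness per objective.
    template<typename... FunctionTypes>
    arma::vec EvaluateObjectives(const arma::mat& coordinates,
                                 FunctionTypes&... functions)
    {
      // The pack expands inside the braced list, calling each
      // function's Evaluate() in turn.
      return arma::vec({ functions.Evaluate(coordinates)... });
    }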

> Right, the codebase does use a lot of memory in the build step, one idea is to
> turn off the python bindings and the executables:
>
> cmake  -DBUILD_CLI_EXECUTABLES=OFF -DBUILD_PYTHON_BINDINGS=OFF ..
>
> you can also check if adding -DDEBUG=ON helps. Also, maybe there is an
> option to increase/add Swap?

I'll turn off the Python bindings and executables and try building
once again. I have been using the DEBUG flag already, so I'll look
into adding swap space next.

Thanks for all the help. I'll begin drafting a proposal in a day or
two. Is there a specific format for the proposal? I went through the
Application Guide; there was no mention of a format, just guidelines
and required content in the proposal. Will a simple Google Doc do?

Thanks and regards,
Chintan

Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-11 Thread Marcus Edel

Hello Chintan,

> I've been through the MOPSO with GD approach as we discussed earlier, and I
> certainly like the idea. My thoughts as of now are:
> 
> (1) Add a basic PSO optimizer to the existing optimization API (might refer
> to the gradient descent optimizer code for help/coding style).

There is already an open PR for the standard PSO
https://github.com/mlpack/mlpack/pull/1225 that should be helpful.

> (2) Add support for constraint-based optimization (the AugLagrangian class
> has some support for equality constraints by calculating the associated
> penalty and keeping it under a threshold; maybe a similar approach will
> work here?).

Agreed, that is definitely an option we can adapt for the PSO method.
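
As a rough sketch of that penalty idea (assuming the function follows
mlpack's constrained-function interface with NumConstraints() and
EvaluateConstraint(); the fixed sigma weight here is illustrative, not
the full augmented Lagrangian update):

    // Penalized fitness for one particle: the objective plus a
    // quadratic penalty on each (equality) constraint violation.
    template<typename LagrangianFunctionType>
    double PenalizedEvaluate(LagrangianFunctionType& f,
                             const arma::mat& coordinates,
                             const double sigma)
    {
      double objective = f.Evaluate(coordinates);
      for (size_t i = 0; i < f.NumConstraints(); ++i)
      {
        const double c = f.EvaluateConstraint(i, coordinates);
        objective += sigma * c * c;
      }
      return objective;
    }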

> (3) Extend the functionality to multi-objective optimization, this might
> require reworking the Evaluate() methods of FunctionTypes to evaluate the
> position for multiple objective functions; perhaps a better approach would
> be to add another FunctionType which will be used by the MOPSO optimizer
> (MultiObjectiveFunctionType?).

Agreed, another FunctionType is a good idea to represent multiple functions, one
idea is to use variadic templates to pass multiple functions, that would allow
us to use the same interface.

> I also had a small doubt: is there a minimum amount of RAM/resources I need
> to have on my system? I am running Fedora 27 on a Core i5 with 4GB of RAM,
> and I cannot keep another application open while building the code (I
> switched from the distributed tar file to a clone of the repo), not even
> Atom. Should I consider getting a RAM upgrade?

Right, the codebase does use a lot of memory in the build step; one idea is
to turn off the Python bindings and the executables:

cmake  -DBUILD_CLI_EXECUTABLES=OFF -DBUILD_PYTHON_BINDINGS=OFF ..

you can also check if adding -DDEBUG=ON helps. Also, maybe there is an option to
increase/add Swap?

Let me know if I should clarify anything.

Thanks,
Marcus

> On 11. Mar 2018, at 11:39, Chintan Soni  wrote:
> 
> Hi Marcus,
> 
> I've been through the MOPSO with GD approach as we discussed earlier, and I 
> certainly like the idea. My thoughts as of now are:
> 
> (1) Add a basic PSO optimizer to the existing optimization API (might refer 
> to the gradient descent optimizer code for help/coding style).
> 
> (2) Add support for constraint-based optimization (the AugLagrangian class has 
> some support for equality constraints by calculating the associated penalty 
> and keeping it under a threshold; maybe a similar approach will work here?).
> 
> (3) Extend the functionality to multi-objective optimization, this might 
> require reworking the Evaluate() methods of FunctionTypes to evaluate the 
> position for multiple objective functions; perhaps a better approach would be 
> to add another FunctionType which will be used by the MOPSO optimizer 
> (MultiObjectiveFunctionType?).
> 
> What are your thoughts about this? Especially regarding (2)?
> 
> I also had a small doubt: is there a minimum amount of RAM/resources I need 
> to have on my system? I am running Fedora 27 on a Core i5 with 4GB of RAM, 
> and I cannot keep another application open while building the code (I 
> switched from the distributed tar file to a clone of the repo), not even 
> Atom. Should I consider getting a RAM upgrade?
> 
> Thanks and regards,
> Chintan


Re: [mlpack] GSoC 2018: Particle Swarm Optimization

2018-03-11 Thread Chintan Soni
Hi Marcus,

I've been through the MOPSO with GD approach as we discussed earlier, and I
certainly like the idea. My thoughts as of now are:

(1) Add a basic PSO optimizer to the existing optimization API (might refer
to the gradient descent optimizer code for help/coding style).

(2) Add support for constraint-based optimization (the AugLagrangian class
has some support for equality constraints by calculating the associated
penalty and keeping it under a threshold; maybe a similar approach will
work here?).

(3) Extend the functionality to multi-objective optimization, this might
require reworking the Evaluate() methods of FunctionTypes to evaluate the
position for multiple objective functions; perhaps a better approach would
be to add another FunctionType which will be used by the MOPSO optimizer
(MultiObjectiveFunctionType?).

What are your thoughts about this? Especially regarding (2)?

I also had a small doubt: is there a minimum amount of RAM/resources I need
to have on my system? I am running Fedora 27 on a Core i5 with 4GB of
RAM, and I cannot keep another application open while building the code (I
switched from the distributed tar file to a clone of the repo), not even
Atom. Should I consider getting a RAM upgrade?

Thanks and regards,
Chintan