Hi Grey, and others,

I wish to consider a whole family of optimization problems which are defined as follows:

- the objective function is f: [0,1]^n -> R;
- the n parameters are partitioned into m sets, and each set of parameters represents a probability distribution.

So, for a very small example, suppose n = 3, m = 1, and denote the parameters a, b, and c. Then we have the constraint a+b+c=1.

To run a gradient algorithm, we need the gradient of f with respect to each parameter. However, these partial derivatives cannot be used independently: to preserve the a+b+c=1 relationship, df/da cannot be treated separately from df/db and df/dc. Simply updating one parameter (say a) along its own gradient component (df/da) is not correct: the parameter space is not Euclidean, because a variation of one parameter forces some variation of the other two in order to keep the a+b+c=1 constraint satisfied.
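To make the dependence concrete, here is a small sketch in plain NumPy (not NLopt); it only illustrates the idea of projecting the Euclidean gradient onto the tangent space of the simplex, with a made-up gradient vector:

import numpy as np

def project_onto_simplex_tangent(grad):
    # The feasible set {x : x_1 + ... + x_n = 1} has tangent space
    # {v : v_1 + ... + v_n = 0}; the Euclidean projection onto it
    # simply subtracts the mean component. A step along the projected
    # gradient preserves the sum-to-one relationship (the bounds
    # 0 <= x_i <= 1 would still need separate handling).
    return grad - np.mean(grad)

# Made-up gradient (df/da, df/db, df/dc) at the current point:
g = np.array([0.3, -0.1, 0.5])
descent_direction = -project_onto_simplex_tangent(g)  # components sum to zero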

So my question is whether the algorithms available in NLopt take care of this. I doubt it, but I'd like to be sure.
Then the next question is: how should this relationship be taken care of?
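For concreteness, here is the kind of workaround I currently have in mind: a softmax reparameterization, so that an unconstrained gradient-based NLopt algorithm (LD_LBFGS below) works in an ordinary Euclidean space while the sum-to-one constraint holds by construction. The objective is only a placeholder (f(x) = sum_i x_i^2) to keep the sketch runnable; it is not my real f.

import nlopt
import numpy as np

def softmax(z):
    # Maps an unconstrained z in R^n onto the probability simplex.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def f_of_x(x):
    # Placeholder objective; replace with the real f.
    return float(np.sum(x**2))

def grad_f_of_x(x):
    # Euclidean gradient of the placeholder objective.
    return 2.0 * x

def objective(z, grad):
    x = softmax(z)
    g = grad_f_of_x(x)
    if grad.size > 0:
        # Chain rule through the softmax: dF/dz_k = x_k * (g_k - <g, x>).
        grad[:] = x * (g - np.dot(g, x))
    return f_of_x(x)

n = 3
opt = nlopt.opt(nlopt.LD_LBFGS, n)  # unconstrained, gradient-based
opt.set_min_objective(objective)
opt.set_xtol_rel(1e-8)
z_opt = opt.optimize(np.array([0.5, -0.2, 0.1]))
x_opt = softmax(z_opt)              # lies on the simplex, sums to 1

This makes the problem unconstrained, but it does not really answer the natural-gradient question, which is why I am asking.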

Thanks a lot,

Philippe

On 25/01/2017 18:52, Grey Gordon wrote:
Hi Philippe,

Is your problem to min_{a,b,c} f(a,b,c) s.t. a+b+c=1 for f: R^3 -> R? Do you mean your function is non-Euclidean because it maps to some space other than R?

Perhaps more concretely explaining your problem would help.

Best,
Grey


On Jan 25, 2017, at 11:18 AM, philippe preux <[email protected]> wrote:

Hi,
I am optimizing a differentiable function defined over a probability distribution. That is, say the function to optimize has 3 parameters a, b, and c, each being a probability and such that a + b + c = 1.
We know that optimizing each parameter independently of the other 2 is not the best way to go, since we do not take the a+b+c=1 constraint into consideration. The solution is not to add this constraint to the problem via an equality constraint; the issue is that the space is not Euclidean and that, whenever one computes the gradient with respect to one parameter (say a), the 2 others should also be considered, to take into account the shape of the manifold on which I optimize. It seems to me that directional derivatives, or natural gradients, are needed here.
So my question is: how to deal with such non-Euclidean spaces with NLopt?
Thanks for any help,
Philippe
Thanks for any help,
Philippe


_______________________________________________
NLopt-discuss mailing list
[email protected]
http://ab-initio.mit.edu/cgi-bin/mailman/listinfo/nlopt-discuss

