On 01/22/2014 05:14 PM, Julius Ziegler wrote:
> On 01/22/2014 05:07 PM, Tobias Schmidt wrote:
>>> It looks like you are basing the difference quotient on the
>>> last value of x passed into the function ("x_pred"). I do not
>>> know if this is a good idea.
> 
>> I assume that the step size is small enough to calculate the
>> difference quotient backwards. Maybe this assumption is a bad
>> idea (it was the first one I had). By the way, I corrected the
>> wrong calculation of the differences, but it still does not run.
>> Your advice looks smart; I will try it that way.
> 
> I think the problem is that the gradient you return on the first 
> iteration - when there is no valid x_pred - is 0 (is that so? I
> have not checked in depth).
> 
> Now imagine you are doing a Newton step based on a 0 gradient. You 
> would require an infinite step in some direction (the gradient is 
> actually in the denominator). Just look at the nice animation for
> the 1-D case here (and keep in mind that in Newton-type methods, we
> are actually looking for the root of the gradient): 
> http://en.wikipedia.org/wiki/Newton%27s_method

Correction!

I was a bit wrong here: what actually sits in the denominator is the
second derivative (of the objective function), not the gradient.
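
To spell it out for the 1-D case: the Newton step for minimizing f is

    x_{k+1} = x_k - f'(x_k) / f''(x_k),

i.e. Newton's method applied to finding a root of f'.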

However, if you return a gradient of 0 (for the objective function),
the optimizer will most likely enter some termination routine, because
it thinks that it has arrived at a stationary point.

This might be less problematic for the constraint gradients, but I
would not bet on it...
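
For what it is worth, below is a minimal, untested sketch of how the
forward-difference gradient could be computed inside the callback
itself, assuming the standard nlopt_func signature. The objective
myfun and the step size h are placeholders; you would substitute your
own objective and a problem-dependent step size.

#include <stdlib.h>
#include <string.h>

/* Placeholder objective; substitute your real myfun here. */
static double myfun(unsigned n, const double *x)
{
    double s = 0.0;
    for (unsigned i = 0; i < n; ++i)
        s += x[i] * x[i];
    return s;
}

/* NLopt-style callback: fills grad[] by forward differences whenever
   NLopt asks for a gradient, so it never reports a gradient of 0 just
   because it is the first call. */
static double myfun_nlopt(unsigned n, const double *x, double *grad,
                          void *data)
{
    const double h = 1e-6;            /* step size, problem dependent */
    double f0 = myfun(n, x);
    (void)data;                       /* unused user data */

    if (grad) {
        double *x1 = malloc(n * sizeof *x1);  /* perturbed copy of x */
        memcpy(x1, x, n * sizeof *x1);
        for (unsigned i = 0; i < n; ++i) {
            x1[i] = x[i] + h;
            grad[i] = (myfun(n, x1) - f0) / h;
            x1[i] = x[i];             /* restore before next coordinate */
        }
        free(x1);
    }
    return f0;
}

You would register it with nlopt_set_min_objective(opt, myfun_nlopt,
NULL). The constraint callbacks added via
nlopt_add_inequality_constraint / nlopt_add_equality_constraint follow
the same gradient convention, and those are where I would look first,
since the fault only shows up once a constraint is added.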

Julius


> Best, Julius
> 
> 
>>> Another thing: is the equivalent of your objective function 
>>> myfun available in MATLAB? Have you checked if it is
>>> implemented correctly?
> 
>> Yes, the objective function is equivalent to the objective
>> function in the MATLAB code.
> 
>> I guess that at least one of the constraint functions causes the 
>> fault, but I do not know why!
> 
>> Regards, Tobias
> 
>> On 22.01.2014 16:18, Julius Ziegler wrote:
>> On 01/22/2014 04:09 PM, Tobias Schmidt wrote:
>>>>> Oh, you're right. I forgot that. But even if I comment out
>>>>> the zeroconfun constraint, the fault occurs. The fault occurs
>>>>> if _any_ of my constraint functions is added. Without
>>>>> these functions the optimization returns successfully.
> 
>> Your way of computing the gradients looks strange. It looks like 
>> you are basing the difference quotient on the last value of x 
>> passed into the function ("x_pred"). I do not know if this is a 
>> good idea.
> 
>> Why don't you just calculate it like this (this is not working
>> C code, just for explanation)? Use h "small", where "small" is
>> dependent on your domain knowledge.
> 
>> const h = 0.0001;
> 
>> f0 = myfun(x);
> 
>> for all i: x1 = x; x1[i] += h; f1 = myfun(x1); grad[i] = (f1 - f0) / h;
> 
>> return f0;
> 
>> Another thing: is the equivalent of your objective function myfun
>> available in MATLAB? Have you checked if it is implemented
>> correctly?
> 
>> Best, Julius
> 
>>>>> Regards, Tobias
>>>>> 
>>>>> 
>>>>> On 22.01.2014 15:30, Julius Ziegler wrote:
>>>>> On 01/22/2014 02:59 PM, Tobias Schmidt wrote:
>>>>>>>> Sure! Thank you for investing your time ;-).
>>>>> 
>>>>> You are not computing the gradient of zeroconfun, but it 
>>>>> looks like you are using it as a constraint. Have you 
>>>>> overlooked it?
>>>>> 
>>>>> Best, Julius
>>>>> 
>>>>>>>> Regards, Tobias
>>>>>>>> 
>>>>>>>> On 22.01.2014 14:23, Julius Ziegler wrote:
>>>>>>>> Could you attach your updated code, please?
>>>>>>>> 
>>>>>>>> Best, Julius
>>>>>>>> 

-- 
Dipl.-Inform. Julius Ziegler <[email protected]>

Institut für Mess- und Regelungstechnik
Karlsruher Institut für Technologie

Department of Measurement and Control
Karlsruhe Institute of Technology

Engler-Bunte-Ring 21
76131 Karlsruhe

Tel. +49 721 608 47146


_______________________________________________
NLopt-discuss mailing list
[email protected]
http://ab-initio.mit.edu/cgi-bin/mailman/listinfo/nlopt-discuss
