Hi Edward.

You have already convinced me that there are other options.

What I am trying to do here is to give the user the ability to try
different methods.

Here, one such method is scipy.optimize.leastsq.

The documentation of the user function describes in detail that this is
done by scipy.optimize.leastsq.

Anything which comes from it can therefore be traced back to this function.
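
For illustration, a minimal sketch of such a scipy.optimize.leastsq fit
of the exponential model I*exp(-R*t) (hypothetical data and variable
names, not the actual relax implementation):

    import numpy as np
    from scipy.optimize import leastsq

    # Hypothetical peak intensities decaying as I*exp(-R*t).
    times = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
    intensities = np.array([100.0, 61.0, 37.0, 13.5, 1.8])

    def residuals(params, t, y):
        # Difference between the measured data and the model.
        I0, R = params
        return y - I0 * np.exp(-R * t)

    # Initial guess [I, R]; leastsq calls MINPACK's Levenberg-Marquardt.
    params, ier = leastsq(residuals, [90.0, 4.0], args=(times, intensities))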

It is the same argument as for the published "model functions" in
dispersion: one cannot change them, so they stay as they are published.

Best
Troels


2014-08-25 11:43 GMT+02:00 Edward d'Auvergne <edw...@nmr-relax.com>:
> Hi,
>
> The scipy.optimize.leastsq() function contained a fatal bug back in
> 2003.  I reported it and it was not fixed the last time I checked,
> probably around 2008.  I wrote minfx to replace scipy.optimize due to
> the fatal bugs present in all 3 optimisation algorithms available back
> in 2003 (Levenberg-Marquardt, Nelder-Mead simplex, and one other).
> The switch to MINPACK means that they have since changed the code for
> the LM algorithm, but given the disregard the developers had for
> fixing fatal bugs back in the early days, there is not much anyone can
> say to convince me that the current code is not also problematic.
> Showing that scipy.optimize.leastsq() finds the correct result does
> not disprove this.  The fatal failures occur rarely, hence normally
> you don't see a problem.  But when a failure does occur, unless you
> are checking carefully you will never know.  This is typical of
> optimisation bugs, for example the fatal bug in Modelfree4 that I
> fixed, which was released as version 4.20
> (http://www.palmer.hs.columbia.edu/software/modelfree.html).  This bug
> went unnoticed in 300+ publications from 1991 to 2006 (over 15 years),
> despite affecting all of these published results!
>
> Minfx is far more extensive and complete than scipy.optimize, and it
> is tested in the relax test suite.  The only feature scipy.optimize
> has that minfx does not is that it calls scipy.integrate.quadpack for
> numerical gradient estimates.  But deriving the gradients yourself is
> basic.  For example, using Quickmath, open
> http://www.quickmath.com/webMathematica3/quickmath/calculus/differentiate/advanced.jsp
> and set the equation to:
>
> I*exp(-R*t)
>
> Set the variable to I first, and the differentiate button will give
> you the df/dI partial derivative.  Set the variable to R,
> differentiate, and the result is the df/dR partial derivative.  The
> vector [df/dI, df/dR] is the gradient.  The Hessian is just as simple:
> set the variables to [I, I], then [I, R], then [R, I], then [R, R].
> Put the results into a matrix and you have the Hessian.  I cannot
> emphasise enough how basic these equations and solutions are.
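>
> To make this concrete, a minimal Python sketch of the resulting
> gradient and Hessian (an illustration only, not code from relax or
> minfx), including a finite-difference check of df/dR:
>
>     import numpy as np
>
>     def gradient(I, R, t):
>         # The gradient [df/dI, df/dR] of f = I*exp(-R*t).
>         e = np.exp(-R * t)
>         return np.array([e, -I * t * e])
>
>     def hessian(I, R, t):
>         # The Hessian [[d2f/dI2, d2f/dIdR], [d2f/dRdI, d2f/dR2]].
>         e = np.exp(-R * t)
>         return np.array([[0.0, -t * e],
>                          [-t * e, I * t**2 * e]])
>
>     # Finite-difference check of df/dR at I=100, R=5, t=0.1.
>     h = 1e-6
>     fd = (100.0 * np.exp(-(5.0 + h) * 0.1)
>           - 100.0 * np.exp(-(5.0 - h) * 0.1)) / (2.0 * h)
>     assert np.isclose(fd, gradient(100.0, 5.0, 0.1)[1])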
>
> Anyway, the optimisation solution in relax is minfx, not
> scipy.optimize.  All algorithms in scipy are available in minfx.  And
> the scipy ones have never been faster than the minfx ones.  The
> MINPACK LM algorithm might be slightly faster than the minfx LM
> algorithm, but this is not the reason for the speed-ups you see!
>
> Regards,
>
> Edward
>
> On 25 August 2014 11:06, Troels Emtekær Linnet <tlin...@nmr-relax.com> wrote:
>> Hi Edward.
>>
>> This post continues from:
>> http://thread.gmane.org/gmane.science.nmr.relax.devel/6814
>>
>> As I wrote in this post, the new user function:
>> "relax_disp.r2eff_estimate()"
>>
>> gives the user the freedom to try other algorithms.
>> Here that is scipy.optimize.leastsq, which is a well-known option in
>> the Python community for minimisation.
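>>
>> For example, a minimal sketch of a leastsq call that also returns
>> parameter error estimates through its covariance output (hypothetical
>> data and variable names, not the actual r2eff_estimate() code):
>>
>>     import numpy as np
>>     from scipy.optimize import leastsq
>>
>>     times = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
>>     intensities = np.array([100.0, 61.0, 37.0, 13.5, 1.8])
>>
>>     def residuals(params, t, y):
>>         # Residuals of the exponential model I*exp(-R*t).
>>         I0, R = params
>>         return y - I0 * np.exp(-R * t)
>>
>>     popt, cov, infodict, mesg, ier = leastsq(
>>         residuals, [90.0, 4.0], args=(times, intensities),
>>         full_output=True)
>>
>>     # Scale the covariance by the reduced chi-squared to get the
>>     # parameter variances (cov is None for a singular Jacobian).
>>     dof = len(times) - len(popt)
>>     chi2 = np.sum(infodict['fvec']**2)
>>     errors = np.sqrt(np.diag(cov) * chi2 / dof)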
>>
>> Did you find errors in MINPACK or in scipy?
>>
>> Best
>> Troels
>>
>>
>> 2014-08-25 10:13 GMT+02:00 Edward d'Auvergne <edw...@nmr-relax.com>:
>>> Hi Troels,
>>>
>>> You should note that the 20x speed up is due to the change in
>>> optimisation algorithm rather than scipy vs. minfx vs. MINPACK.  The
>>> reason I wrote minfx is that I started with scipy optimisation but
>>> found that all the implemented algorithms contained fatal bugs.  These were
>>> not fixed for years after I reported them, and I don't know if the code
>>> has changed since 2003 when I looked into it.  Anyway, a scipy optimisation
>>> solution is incompatible with the minfx optimisation solution in
>>> relax.  If you derive and code the gradients into relax, then you can
>>> use the minfx LM algorithm as a solution.
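>>>
>>> As an illustration of what deriving and coding the gradients looks
>>> like (shown here with scipy.optimize.leastsq's Dfun argument purely
>>> as a sketch, since the minfx call interface is not shown in this
>>> thread):
>>>
>>>     import numpy as np
>>>     from scipy.optimize import leastsq
>>>
>>>     def residuals(params, t, y):
>>>         # Residuals of the exponential model I*exp(-R*t).
>>>         I0, R = params
>>>         return y - I0 * np.exp(-R * t)
>>>
>>>     def dresiduals(params, t, y):
>>>         # Analytic Jacobian of the residuals, shape (len(t), 2):
>>>         # d(res)/dI = -exp(-R*t), d(res)/dR = I*t*exp(-R*t).
>>>         I0, R = params
>>>         e = np.exp(-R * t)
>>>         return np.array([-e, I0 * t * e]).T
>>>
>>>     t = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
>>>     y = np.array([100.0, 61.0, 37.0, 13.5, 1.8])
>>>     popt, ier = leastsq(residuals, [90.0, 4.0], args=(t, y),
>>>                         Dfun=dresiduals)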
>>>
>>> Regards,
>>>
>>> Edward
>>>
>>>
>>> On 25 August 2014 01:08,  <tlin...@nmr-relax.com> wrote:
>>>> Author: tlinnet
>>>> Date: Mon Aug 25 01:08:44 2014
>>>> New Revision: 25228
>>>>
>>>> URL: http://svn.gna.org/viewcvs/relax?rev=25228&view=rev
>>>> Log:
>>>> Further improved the profiling of relax curve fit.
>>>>
>>>> This profiling shows that the Python code is about twice as slow as the
>>>> implemented C code.
>>>>
>>>> But it also shows that optimising with scipy.optimize.leastsq is 20x
>>>> faster.  It also gives reasonable error values.
>>>>
>>>> Combining a linear fit function to guess the initial values with scipy
>>>> optimisation will be an extreme time win for estimating R2eff values
>>>> quickly.
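>>>>
>>>> A minimal sketch of such a linear-fit initial guess (an illustration
>>>> only, not the actual relax code): taking the log of y = I*exp(-R*t)
>>>> gives ln(y) = ln(I) - R*t, a straight line in t, so a degree-one
>>>> polynomial fit supplies the starting values.
>>>>
>>>>     import numpy as np
>>>>
>>>>     # Hypothetical intensities decaying as I*exp(-R*t).
>>>>     times = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
>>>>     intensities = np.array([100.0, 61.0, 37.0, 13.5, 1.8])
>>>>
>>>>     # The slope is -R and the intercept is ln(I).
>>>>     slope, intercept = np.polyfit(times, np.log(intensities), 1)
>>>>     R0, I0 = -slope, np.exp(intercept)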
>>>>
>>>> A further test would be to run relax Monte Carlo simulations for, say,
>>>> 1000-2000 iterations, and compare the resulting errors to those
>>>> extracted from the estimated covariance matrix.
>>>>
>>>> Added:
>>>>     trunk/test_suite/shared_data/curve_fitting/profiling/relax_fit.py
>>>> Modified:
>>>>     
>>>> trunk/test_suite/shared_data/curve_fitting/profiling/profiling_relax_fit.py
>>>>
>>>> [This mail would be too long, it was shortened to contain the URLs only.]
>>>>
>>>> Modified: 
>>>> trunk/test_suite/shared_data/curve_fitting/profiling/profiling_relax_fit.py
>>>> URL: 
>>>> http://svn.gna.org/viewcvs/relax/trunk/test_suite/shared_data/curve_fitting/profiling/profiling_relax_fit.py?rev=25228&r1=25227&r2=25228&view=diff
>>>>
>>>> Added: trunk/test_suite/shared_data/curve_fitting/profiling/relax_fit.py
>>>> URL: 
>>>> http://svn.gna.org/viewcvs/relax/trunk/test_suite/shared_data/curve_fitting/profiling/relax_fit.py?rev=25228&view=auto

_______________________________________________
relax (http://www.nmr-relax.com)

This is the relax-devel mailing list
relax-devel@gna.org

To unsubscribe from this list, get a password
reminder, or change your subscription options,
visit the list information page at
https://mail.gna.org/listinfo/relax-devel
