Dear Professor Michael Creel,

The other day you advised me to use the functions mle_estimate and mle_example to compute the standard deviations of my estimates. Given the large number of parameters I have to estimate, do these two functions also incorporate the "limited memory" option, so that I can benefit both from its speed of convergence and from the standard deviations of the parameters? Could you also refer me to documents that explain the samin algorithm (preferably materials that are not too complicated)?

Best regards.
Michael Creel wrote:

> On Sun, Jan 17, 2010 at 1:01 AM, george brida <george.br...@gmail.com> wrote:
>> Hi everybody,
>> I would like to know the typical time to convergence of the function
>> "bfgsmin". I have a nonlinear regression model with 32 parameters to
>> estimate. I wrote four functions that are used by the main program,
>> which in turn calls bfgsmin. The program has now been running for about
>> 24 hours. I wonder whether this duration is acceptable and whether,
>> with this many parameters, the program can be expected to converge. I
>> hope it does, as it forms the last part of my PhD thesis.
>> Thank you very much for your help.
>> George.
>
> Hi George,
>
> The time to convergence depends very much on the problem and on the
> starting values. Things that can cause lack of convergence are:
> * the existence of multiple local minima
> * the existence of flat spots, saddle points, etc.
> * loss of precision in computations
> * poor scaling of parameters, so that the elements of the gradient have
>   markedly different magnitudes
> * poor starting values
> Also, the default convergence tolerances for bfgsmin are quite strict;
> it might be advisable to loosen them for a problem with this many
> parameters.
>
> For a problem with more than 30 parameters, you might try the limited
> memory option; it could very well be faster. You could also try using
> the simulated annealing algorithm (samin) to find reasonably good
> starting values, and then use bfgsmin to sharpen up the results.
>
> If you are supplying the analytic gradient, you should verify that you
> have programmed it correctly by comparing it to numeric derivatives. I
> had to repeat a number of computations in my own thesis due to
> incorrect analytic derivatives :-)
>
> Good luck,
> Michael

--
View this message in context: http://old.nabble.com/average-duration-of-convergence-of-the-function-%22bfgsmin%22-tp27195007p27204594.html
Sent from the octave-dev mailing list archive at Nabble.com.

------------------------------------------------------------------------------
Throughout its 18-year history, RSA Conference consistently attracts the
world's best and brightest in the field, creating opportunities for
Conference attendees to learn about information security's most important
issues through interactions with peers, luminaries and emerging and
established companies.
http://p.sf.net/sfu/rsaconf-dev2dev
_______________________________________________
Octave-dev mailing list
Octave-dev@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/octave-dev
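The two-stage approach Michael describes in the thread above (samin to find rough starting values, then bfgsmin with the limited-memory option and loosened tolerances) might be sketched in Octave roughly as follows. This is only a sketch under stated assumptions: it requires the octave-forge optim package; the objective nls_obj, the model, the data, the bounds, and every number in the control cells are illustrative, not from the thread; and the control-cell layouts have varied across optim releases, so they should be verified against `help samin` and `help bfgsmin` on your own installation.

```octave
1;
% Sketch only: assumes the octave-forge optim package is installed.
pkg load optim

% Hypothetical nonlinear least squares objective (replace with your own).
function obj = nls_obj (theta, y, x)
  err = y - x * theta .^ 2;   % placeholder nonlinear model
  obj = err' * err;
end

k = 32;                             % number of parameters, as in the thread
theta_true = 0.5 * ones (k, 1);     % synthetic data for the sketch
x = randn (200, k);
y = x * theta_true .^ 2 + 0.1 * randn (200, 1);
theta0 = zeros (k, 1);

% Stage 1: simulated annealing (samin) for decent start values.
% Assumed control layout (check `help samin`):
% {lb, ub, nt, ns, rt, maxevals, neps, functol, paramtol, verbosity, minarg}
lb = -10 * ones (k, 1);
ub =  10 * ones (k, 1);
sa_control = {lb, ub, 5, 5, 0.5, 1e5, 5, 1e-3, 1e-2, 0, 1};
theta_sa = samin ("nls_obj", {theta0, y, x}, sa_control);

% Stage 2: limited-memory BFGS to sharpen the result.
% Assumed control layout (check `help bfgsmin`):
% {maxiters, verbosity, conv criterion, minarg, lbfgs memory,
%  function tol, parameter tol, gradient tol}
% A positive integer in the memory slot switches on limited-memory
% updating; the tolerances here are looser than the strict defaults.
bfgs_control = {500, 1, 1, 1, 10, 1e-8, 1e-4, 1e-3};
[theta_hat, obj, convergence] = bfgsmin ("nls_obj", {theta_sa, y, x}, bfgs_control);
```

The design point is the one Michael makes: samin is slow but global, so it only needs loose tolerances and a modest evaluation budget, while the limited-memory bfgsmin stage avoids building the full 32x32 inverse-Hessian approximation and converges quickly from an already decent starting point.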
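Michael's last piece of advice — checking a hand-coded analytic gradient against numeric derivatives before trusting it — takes only a few lines of Octave and needs no extra packages. The objective, its gradient, the test point, and the step size below are all illustrative, not from the thread:

```octave
1;
% Illustrative objective: f(theta) = sum(theta.^4) + theta(1)*theta(2).
function obj = f (theta)
  obj = sum (theta .^ 4) + theta(1) * theta(2);
end

% Hand-coded analytic gradient of the objective above.
function g = analytic_grad (theta)
  g = 4 * theta .^ 3;
  g(1) += theta(2);
  g(2) += theta(1);
end

% Central finite differences: (f(x+h*e_i) - f(x-h*e_i)) / (2h).
function g = numeric_grad (fcn, theta, h)
  g = zeros (size (theta));
  for i = 1:numel (theta)
    e = zeros (size (theta));
    e(i) = h;
    g(i) = (fcn (theta + e) - fcn (theta - e)) / (2 * h);
  end
end

theta = [0.5; -1.2; 2.0];                 % arbitrary test point
ga = analytic_grad (theta);
gn = numeric_grad (@f, theta, 1e-6);
printf ("max abs difference: %g\n", max (abs (ga - gn)));
% A difference that is large relative to the finite-difference error
% (roughly 1e-6 here) usually signals a bug in the analytic gradient.
```

Checking at several random points, not just one, guards against terms that happen to vanish at a single test point.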