On Fri, Jul 06, 2012 at 09:39:30PM +0000, Italo Maia wrote:
>
> Hummm, so my assumption that my previous values for a, b and c were
> the best was wrong. I calculated the residual and it really is smaller.
> Many thanks for that!
I wouldn't take the difference too seriously, given that the data are
not really close to the curve. The errors seem quite large.

> Any tips on calculating the r-squared?

No.

Gilles

>
> Date: Fri, 6 Jul 2012 22:05:26 +0200
> From: [email protected]
> To: [email protected]
> Subject: Re: [math]
>
> Hi.
>
> If you are using the function
>
>   a * Math.pow(t, b) * Math.exp(-c * t)
>
> the gradient is:
>
>   { Math.pow(t, b) * Math.exp(-c * t),
>     a * Math.log(t) * Math.pow(t, b) * Math.exp(-c * t),
>     -a * t * Math.pow(t, b) * Math.exp(-c * t) }
>
> > // No idea what goes here. Nothing seems to work.
>
> Well, the gradient (partial derivatives w.r.t. the parameters) is the
> thing that will work; the attached figure shows the data and the
> function that fits it with
>   a = 1.097378664278161
>   b = 0.4273818336149512
>   c = 0.01457006142420487
>
> > a, b and c for this example should be: A: 1.0782 B: 0.4583 C: 0.0166
>
> The fit is slightly better with the values found by "CurveFitter"
> (the "LevenbergMarquardt" algorithm actually).
>
> Regards,
> Gilles
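For anyone wiring the gradient above into Commons Math, here is a minimal
sketch of how the function and its gradient might be plugged into
CurveFitter with the Levenberg-Marquardt optimizer. It assumes the Commons
Math 3.x package layout (org.apache.commons.math3.analysis.ParametricUnivariateFunction,
org.apache.commons.math3.optimization.fitting.CurveFitter,
org.apache.commons.math3.optimization.general.LevenbergMarquardtOptimizer);
class locations differ in other versions. The (t, y) observations and the
initial guess below are placeholders, not the data from the thread.

    import org.apache.commons.math3.analysis.ParametricUnivariateFunction;
    import org.apache.commons.math3.optimization.fitting.CurveFitter;
    import org.apache.commons.math3.optimization.general.LevenbergMarquardtOptimizer;

    public class GammaLikeFit {
        /** f(t; a, b, c) = a * t^b * exp(-c * t) and its gradient w.r.t. a, b, c. */
        private static class Model implements ParametricUnivariateFunction {
            public double value(double t, double... p) {
                final double a = p[0], b = p[1], c = p[2];
                return a * Math.pow(t, b) * Math.exp(-c * t);
            }
            public double[] gradient(double t, double... p) {
                final double a = p[0], b = p[1], c = p[2];
                final double common = Math.pow(t, b) * Math.exp(-c * t);
                return new double[] {
                    common,                   // df/da
                    a * Math.log(t) * common, // df/db
                    -a * t * common           // df/dc
                };
            }
        }

        public static void main(String[] args) {
            // Placeholder observations; substitute the real (t, y) data.
            double[] t = { 1, 2, 3, 5, 8, 13, 21 };
            double[] y = { 1.0, 1.3, 1.5, 1.9, 2.1, 2.4, 2.3 };

            CurveFitter fitter = new CurveFitter(new LevenbergMarquardtOptimizer());
            for (int i = 0; i < t.length; i++) {
                fitter.addObservedPoint(t[i], y[i]);
            }
            // Initial guess for (a, b, c).
            double[] best = fitter.fit(new Model(), new double[] { 1, 0.5, 0.01 });
            System.out.println("a = " + best[0] + ", b = " + best[1] + ", c = " + best[2]);
        }
    }

The optimizer minimizes the sum of squared residuals between the observed
points and the model, which is why returning an accurate gradient for all
three parameters matters here.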

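The residual comparison mentioned in the thread ("I calculated the residual",
"the fit is slightly better") boils down to a sum of squared residuals. This
is a generic sketch, not code from the thread; the t[] and y[] arrays stand
in for the observed data shown in the attached figure, which is not
reproduced here.

    /** Sum of squared residuals of f(t) = a * t^b * exp(-c * t) against (t[i], y[i]). */
    static double sumOfSquaredResiduals(double[] t, double[] y,
                                        double a, double b, double c) {
        double ss = 0;
        for (int i = 0; i < t.length; i++) {
            double r = y[i] - a * Math.pow(t[i], b) * Math.exp(-c * t[i]);
            ss += r * r;
        }
        return ss;
    }

    // Comparing the two parameter sets from the thread (requires the observed data):
    //   sumOfSquaredResiduals(t, y, 1.097378664278161, 0.4273818336149512, 0.01457006142420487)
    //   sumOfSquaredResiduals(t, y, 1.0782, 0.4583, 0.0166)

The smaller sum identifies the better least-squares fit, which is the
criterion the Levenberg-Marquardt algorithm minimizes.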