A full working example is attached below.
> Date: Fri, 6 Jul 2012 11:53:05 +0200
> From: [email protected]
> To: [email protected]
> Subject: Re: [math]
>
> On Thu, Jul 05, 2012 at 10:02:29PM +0000, Italo Maia wrote:
> >
> > Oh my. Fair enough. Here is a sample data.
> >
> > http://pastebin.com/MkQrE8d2
>
> See below.
>
> >
> > The values of a, b and c for this sample data, for best fitting, are:
> > A: 1.0782 B: 0.4583 C: 0.0166
> >
> > When everything is working, I'll publish something about the code.
> > CurveFitter seems very devoid of love.
> >
> > > Date: Thu, 5 Jul 2012 23:52:31 +0200
> > > From: [email protected]
> > > To: [email protected]
> > > Subject: Re: [math]
> > >
> > > Hello.
> > >
> > > On Thu, Jul 05, 2012 at 09:19:17PM +0000, Italo Maia wrote:
> > > >
> > > > Here you go: http://pastebin.com/UR0GV7ST
> > > >
> > >
> > > I'd think that it would be better not to use such a site, since it seems
> > > that the contents will be removed at some point, leading to this thread
> > > being impossible to follow in the archive.
>
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> Here.
>
> > > [Maybe other people on the ML could give their opinion on this aspect.]
> > >
> > > The subject of this thread is not very clear either. :-}
> > >
> > > >
> > > > Unfortunately I can't provide the matrix data. : /
> > >
> > > So, how am I supposed to know what is going on?
> > > Clearly if you define the "gradient" method as on the above page,
> > > it cannot work.
> > >
> > > Please provide, in an attached file, a working example, showing what you
> > > tried and what result you obtained.
>
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> And here.
>
>
> Thanks,
> Gilles
>
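The function being fitted is f(t) = a * t^b * exp(-c*t). Its partial derivatives
with respect to the parameters are df/da = t^b * exp(-c*t),
df/db = a * t^b * ln(t) * exp(-c*t) and df/dc = -a * t^(b+1) * exp(-c*t),
which is what the gradient method below returns.
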
import org.apache.commons.math3.optimization.fitting.CurveFitter;
import org.apache.commons.math3.analysis.ParametricUnivariateFunction;
import org.apache.commons.math3.optimization.general.LevenbergMarquardtOptimizer;
class Fnc {
    // Model: f(t) = a * t^b * exp(-c * t)
    public static double fnc(double t, double a, double b, double c) {
        return a * Math.pow(t, b) * Math.exp(-c * t);
    }

    // Log-domain form of the model (not used by the fit below).
    public static double fnc_log(double t, double a, double b, double c) {
        return Math.log(a) + b * Math.log(t) - c * t;
    }

    // Partial derivatives of the model with respect to a, b and c, in that order.
    public static double[] gradient(double t, double... params) {
        double a = params[0];
        double b = params[1];
        double c = params[2];
        double e = Math.pow(t, b) * Math.exp(-c * t); // t^b * exp(-c*t)
        return new double[]{
            e,                   // df/da = t^b * exp(-c*t)
            a * e * Math.log(t), // df/db = a * t^b * ln(t) * exp(-c*t)
            -t * a * e           // df/dc = -a * t^(b+1) * exp(-c*t)
        };
    }
}
class PUF implements ParametricUnivariateFunction {
    @Override
    public double[] gradient(double x, double... params) {
        return Fnc.gradient(x, params);
    }

    @Override
    public double value(double x, double... params) {
        return Fnc.fnc(x, params[0], params[1], params[2]);
    }
}
public class Least {
    // Sample data: { t, observed value } pairs.
    static double[][] data = {
        { 3, 2.20 },
        { 5, 2.10 },
        { 8, 2.50 },
        { 12, 2.50 },
        { 15, 2.40 },
        { 19, 2.50 },
        { 22, 2.60 },
        { 24, 2.90 },
        { 26, 3.00 },
        { 28, 3.01 },
        { 33, 3.00 },
        { 35, 4.10 },
        { 37, 3.50 },
        { 40, 2.87 },
        { 43, 2.43 },
        { 47, 2.52 },
        { 50, 2.12 },
        { 54, 2.96 },
        { 57, 2.50 },
        { 61, 3.70 },
        { 64, 2.70 },
        { 68, 2.50 },
        { 71, 1.90 },
        { 79, 2.20 },
        { 82, 2.50 },
        { 85, 2.20 },
        { 89, 2.70 },
        { 93, 2.20 },
        { 96, 2.00 },
        { 101, 1.90 },
        { 104, 1.40 },
        { 107, 1.60 },
        { 111, 1.50 },
        { 114, 2.20 },
        { 118, 1.50 },
        { 125, 1.10 },
        { 132, 0.90 },
        { 139, 0.70 },
        { 146, 0.80 },
    };
    public static void main(String... args) {
        LevenbergMarquardtOptimizer optimizer = new LevenbergMarquardtOptimizer();
        CurveFitter fitter = new CurveFitter(optimizer);
        PUF puf = new PUF();
        double[] guess = new double[]{ 1, 0.5, 0.5 };
        for (double[] point : data) {
            fitter.addObservedPoint(point[0], point[1]);
        }
        double[] bestfit = fitter.fit(puf, guess);
        for (double p : bestfit) {
            System.out.println(p);
        }
    }
}
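
A quick way to sanity-check the analytical gradient is to compare it against
central finite differences at a single point. The sketch below is only an
illustration: the class name GradientCheck, the test point and the step size h
are arbitrary choices, and it assumes it is compiled in the same file as the
Fnc class above.

class GradientCheck {
    public static void main(String... args) {
        double t = 35;                           // arbitrary abscissa from the data range
        double[] p = { 1.0782, 0.4583, 0.0166 }; // the best-fit values quoted above
        double h = 1e-6;                         // finite-difference step (arbitrary)
        double[] analytic = Fnc.gradient(t, p);
        for (int i = 0; i < p.length; i++) {
            double[] plus = p.clone();
            double[] minus = p.clone();
            plus[i] += h;
            minus[i] -= h;
            // Central difference approximation of df/dp_i at (t, p).
            double numeric = (Fnc.fnc(t, plus[0], plus[1], plus[2])
                              - Fnc.fnc(t, minus[0], minus[1], minus[2])) / (2 * h);
            System.out.println("df/dp" + i + ": analytic=" + analytic[i]
                               + " numeric=" + numeric);
        }
    }
}

If the analytic and numeric values agree to several digits, the Jacobian passed
to LevenbergMarquardtOptimizer is consistent with the model function.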
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]