Hi Gilles and Randy,

That sounds really interesting, I'll try this out, both the sigmoidal and the scaling stuff!

I'm relatively new to all this least-squares stuff and I'm not a mathematician, so I need to learn a lot; don't hesitate to point out things that might seem obvious to an advanced user, that's really welcome :-)

On 08.06.21 at 15:23, Gilles Sadowski wrote:
Hello.

On Tue, Jun 8, 2021 at 08:14, Christoph Läubrich
<[email protected]> wrote:

Hi Gilles,

I have used the INFINITY approach for a while now and it works quite
well. I just recently found a problem where I got very bad fits after a
handful of iterations using the LevenbergMarquardtOptimizer.

The problem arises whenever there is a relatively small range of valid
values for *one* parameter.

To stay with the Gaussian example, assume that the mean is only valid
in a small window, but norm and sigma are completely free.

My guess is: if there are outliers in the data that indicate a strong
maximum outside this range, the optimizer tries to go in that
'direction'; because I reject this solution, it 'gives up', as it seems
evident that there is no better solution. This can then result in a
Gaussian that is very thin and a really bad fit (cost e.g. about 1E4).

If I help the optimizer (e.g. by adjusting the initial guess of sigma),
it finds a much better solution (cost about 1E-9).

So what I would need is to tell the optimizer (not sure if this is possible
at all!) that not the *whole* solution is bad, but only the choice of
*one* variable, so it could use larger increments for the other variables.
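
For context, here is a minimal sketch of the kind of rejection I mean
(assuming the "INFINITY approach" simply means returning a huge model value
whenever the mean leaves its valid window; the class name RejectingGauss and
the plain Gaussian/gradient code are only illustrative):

import org.apache.commons.math3.analysis.ParametricUnivariateFunction;

// Illustrative sketch only: the mean (param[1]) is accepted only inside
// [minMean, maxMean]; any candidate outside that window is "rejected" by
// returning an effectively infinite model value.
public class RejectingGauss implements ParametricUnivariateFunction {
    private final double minMean;
    private final double maxMean;

    public RejectingGauss(double minMean, double maxMean) {
        this.minMean = minMean;
        this.maxMean = maxMean;
    }

    public double value(double x, double... param) {
        final double norm = param[0];
        final double mean = param[1];
        final double sigma = param[2];
        if (mean < minMean || mean > maxMean) {
            return Double.POSITIVE_INFINITY; // the *whole* candidate is rejected
        }
        final double diff = x - mean;
        return norm * Math.exp(-diff * diff / (2 * sigma * sigma));
    }

    public double[] gradient(double x, double... param) {
        final double norm = param[0];
        final double diff = x - param[1];
        final double sigma = param[2];
        final double g = Math.exp(-diff * diff / (2 * sigma * sigma));
        // Partial derivatives with respect to norm, mean and sigma.
        return new double[] {
            g,
            norm * g * diff / (sigma * sigma),
            norm * g * diff * diff / (sigma * sigma * sigma)
        };
    }
}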

If you want to restrict the range of, say, the mean:

public class MyFunc implements ParametricUnivariateFunction {
    private final Sigmoid meanTransform;

    public MyFunc(double minMean, double maxMean) {
        meanTransform = new Sigmoid(minMean, maxMean);
    }

    public double value(double x, double... param) {
        final double norm = param[0]; // param[0] is the height.
        final double mu = meanTransform.value(param[1]); // param[1] is the mean (in the unconstrained space).
        final double s = param[2]; // param[2] is the standard deviation.
        final double diff = x - mu;
        final double i2s2 = 1 / (2 * s * s);
        return norm * Math.exp(-diff * diff * i2s2); // Gaussian value.
    }
}

// ...
final MyFunc f = new MyFunc(min, max);
final double[] best = fitter.fit(f); // Perform fit.
final double bestMean = new Sigmoid(min, max).value(best[1]); // Map the fitted (unconstrained) value back into (min, max).
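
For completeness, here is a fuller, runnable sketch of the same transform
trick, assuming Commons Math 3's fitting API (SimpleCurveFitter,
WeightedObservedPoints, Sigmoid and Logit are the actual classes); the
gradient, the sample data, the bounds and the class names are illustrative
additions:

import java.util.List;

import org.apache.commons.math3.analysis.ParametricUnivariateFunction;
import org.apache.commons.math3.analysis.function.Logit;
import org.apache.commons.math3.analysis.function.Sigmoid;
import org.apache.commons.math3.fitting.SimpleCurveFitter;
import org.apache.commons.math3.fitting.WeightedObservedPoint;
import org.apache.commons.math3.fitting.WeightedObservedPoints;

public class BoundedMeanGaussianFitDemo {
    /** Gaussian whose mean is constrained to (minMean, maxMean) via a sigmoid transform. */
    static class BoundedMeanGauss implements ParametricUnivariateFunction {
        private final Sigmoid meanTransform;
        private final double lo;
        private final double hi;

        BoundedMeanGauss(double minMean, double maxMean) {
            lo = minMean;
            hi = maxMean;
            meanTransform = new Sigmoid(minMean, maxMean);
        }

        @Override
        public double value(double x, double... param) {
            final double norm = param[0];
            final double mu = meanTransform.value(param[1]); // unconstrained -> (lo, hi)
            final double s = param[2];
            final double diff = x - mu;
            return norm * Math.exp(-diff * diff / (2 * s * s));
        }

        @Override
        public double[] gradient(double x, double... param) {
            final double norm = param[0];
            final double mu = meanTransform.value(param[1]);
            final double s = param[2];
            final double diff = x - mu;
            final double g = Math.exp(-diff * diff / (2 * s * s));
            // Derivative of the sigmoid lo + (hi - lo) / (1 + exp(-t)) w.r.t. t.
            final double dMuDt = (mu - lo) * (hi - mu) / (hi - lo);
            return new double[] {
                g,                                   // d/d(norm)
                norm * g * (diff / (s * s)) * dMuDt, // d/d(t), chain rule through the sigmoid
                norm * g * diff * diff / (s * s * s) // d/d(sigma)
            };
        }
    }

    public static void main(String[] args) {
        final double min = 4;  // illustrative valid window for the mean
        final double max = 6;
        final BoundedMeanGauss f = new BoundedMeanGauss(min, max);

        // Illustrative observations from a Gaussian with norm=3, mean=5.2, sigma=0.8.
        final WeightedObservedPoints obs = new WeightedObservedPoints();
        for (double x = 0; x <= 10; x += 0.5) {
            obs.add(x, 3 * Math.exp(-(x - 5.2) * (x - 5.2) / (2 * 0.8 * 0.8)));
        }
        final List<WeightedObservedPoint> points = obs.toList();

        // The initial guess for the mean (5.0) must be mapped into the
        // unconstrained space with the inverse transform (Logit).
        final double[] start = { 1, new Logit(min, max).value(5.0), 1 };

        final double[] best = SimpleCurveFitter.create(f, start).fit(points);

        // Map the fitted (unconstrained) mean back into (min, max).
        final double bestMean = new Sigmoid(min, max).value(best[1]);
        System.out.println("norm = " + best[0]
                           + ", mean = " + bestMean
                           + ", sigma = " + best[2]);
    }
}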

HTH,
Gilles


