What do you mean?

I run the following test:

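        // Longley data: each row is y followed by the six regressors.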
        double[] design = new double[]{
            60323, 83.0, 234289, 2356, 1590, 107608, 1947,
            61122, 88.5, 259426, 2325, 1456, 108632, 1948,
            60171, 88.2, 258054, 3682, 1616, 109773, 1949,
            61187, 89.5, 284599, 3351, 1650, 110929, 1950,
            63221, 96.2, 328975, 2099, 3099, 112075, 1951,
            63639, 98.1, 346999, 1932, 3594, 113270, 1952,
            64989, 99.0, 365385, 1870, 3547, 115094, 1953,
            63761, 100.0, 363112, 3578, 3350, 116219, 1954,
            66019, 101.2, 397469, 2904, 3048, 117388, 1955,
            67857, 104.6, 419180, 2822, 2857, 118734, 1956,
            68169, 108.4, 442769, 2936, 2798, 120445, 1957,
            66513, 110.8, 444546, 4681, 2637, 121950, 1958,
            68655, 112.6, 482704, 3813, 2552, 123366, 1959,
            69564, 114.2, 502601, 3931, 2514, 125368, 1960,
            69331, 115.7, 518173, 4806, 2572, 127852, 1961,
            70551, 116.9, 554894, 4007, 2827, 130081, 1962
        };

        final int nobs = 16;
        final int nvars = 6;

        // Estimate the model
        MillerUpdatingRegression model =
                new MillerUpdatingRegression(6, true, MathUtils.SAFE_MIN);
        int off = 0;
        double[] tmp = new double[6];
        for (int i = 0; i < nobs; i++) {
            System.arraycopy(design, off + 1, tmp, 0, nvars);
            model.addObservation(tmp, design[off]);
            off += nvars + 1;
        }

        // Check expected beta values from NIST
        RegressionResults result = model.regress();
        double[] betaHat = result.getParameterEstimates();
        TestUtils.assertEquals(betaHat,
                new double[]{-3482258.63459582, 15.0618722713733,
                    -0.358191792925910E-01, -2.02022980381683,
                    -1.03322686717359, -0.511041056535807E-01,
                    1829.15146461355}, 1E-6);
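
The same RegressionResults object also exposes fit statistics, so the run can be checked against the other certified Longley numbers as well. A minimal sketch (not part of the test above), assuming the getRSquared/getErrorSumSquares/getStdErrorOfEstimates accessors on RegressionResults:

        // Sketch only: additional statistics from the fit above, for
        // comparison with the certified Longley values from NIST.
        System.out.println("R-squared         = " + result.getRSquared());
        System.out.println("Error sum squares = " + result.getErrorSumSquares());
        System.out.println("Std errors        = "
                + java.util.Arrays.toString(result.getStdErrorOfEstimates()));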



The regression technique I am adding produces parameter estimates within 1.0e-6
of the NIST certified values, while OLSMultipleLinearRegressionTest matches them
to within 2.0e-8.
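
For reference, here is a rough sketch (illustrative only, not committed code) of how the same design array could be pushed through the in-core OLSMultipleLinearRegression for a side-by-side check; it includes the intercept by default, so the coefficient ordering matches the NIST values above:

        // Sketch only: the same Longley data through the in-core estimator.
        double[] y = new double[nobs];
        double[][] x = new double[nobs][nvars];
        for (int i = 0; i < nobs; i++) {
            y[i] = design[i * (nvars + 1)];
            System.arraycopy(design, i * (nvars + 1) + 1, x[i], 0, nvars);
        }
        OLSMultipleLinearRegression ols = new OLSMultipleLinearRegression();
        ols.newSampleData(y, x);
        double[] olsBeta = ols.estimateRegressionParameters();
        // olsBeta matches the certified values to about 2.0e-8, while betaHat
        // above only matches to about 1.0e-6; that gap is my question.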



On Mon, Jul 11, 2011 at 11:32 PM, Ted Dunning <ted.dunn...@gmail.com> wrote:

> Can you point at code?
>
> On Mon, Jul 11, 2011 at 9:07 PM, Greg Sterijevski <gsterijev...@gmail.com>
> wrote:
>
> > Yes, my apologies. I am a bit new to this.
> >
> >
> > On Mon, Jul 11, 2011 at 10:59 PM, Henri Yandell <flame...@gmail.com>
> > wrote:
> >
> > > I'm assuming this is Commons Math. I've added a [math] so it catches
> > > the interest of those involved.
> > >
> > >
> > > On Mon, Jul 11, 2011 at 8:52 PM, Greg Sterijevski
> > > <gsterijev...@gmail.com> wrote:
> > > > Additionally, I pass all of the Wampler beta estimates.
> > > >
> > > > On Mon, Jul 11, 2011 at 10:40 PM, Greg Sterijevski
> > > > <gsterijev...@gmail.com> wrote:
> > > >
> > > >> Hello All,
> > > >>
> > > >> I am testing the first 'updating' OLS regression algorithm. I ran it
> > > >> through the Wampler1 data. It gets 1.0s for all of the beta
> > > >> estimates. I next ran the Longley dataset. I match, but with a
> > > >> tolerance of 1.0e-6. This is a bit less than two orders of magnitude
> > > >> worse than the current in-core estimator (2.0e-8). My question to
> > > >> the list is: how important is this difference? Is it worth tearing
> > > >> things apart to figure out where the error is accumulating?
> > > >>
> > > >> Thanks,
> > > >>
> > > >> -Greg
> > > >>
> > > >
> > >
> >
>
