On Wed, Sep 24, 2014 at 10:49 AM, RSmith <rsm...@rsweb.co.za> wrote:

> I'm trying to find what the limit is for dividing in terms of accuracy.
>
> Basically I have one program that inserts values into a table and determines
> sort order using one standard trick: a REAL column named
> "SortOrder" which gets the value Highest_previous_value+1 if an insert
> happens with something that needs to be sorted at the end of the table.
>
> For any other position, the SortOrder gets assigned the value:
> ((Prev.Sortorder + Next.Sortorder) / 2)
>

{snipped}

A quick bit of test code shows me that after 53 iterations you'll run out
of precision, which makes sense because a normalized double has 53 mantissa
bits (including the implicit leading 1 bit).

Here is my quick & dirty test code, which may be useful:

#include <stdio.h>

int main(void)
{
    /* Repeatedly replace the lower bound with the midpoint of [lo, hi],
       mimicking the (Prev.SortOrder + Next.SortOrder) / 2 scheme. */
    double lo = 1.0;
    double hi = 2.0;

    int count = 0;

    while (lo != hi)
    {
        double mid = (lo + hi) / 2.0;
        /* %.17g prints enough digits to watch the midpoint creep up on 2.0. */
        printf("%d %.17g\n", ++count, mid);
        lo = mid;
    }

    return 0;
}
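
For what it's worth, here is a companion sketch (assuming IEEE 754 binary64
doubles, which is what SQLite uses for its 8-byte REAL values) that reads the
same limit straight out of <float.h> instead of looping; DBL_MANT_DIG and
nextafter are standard C99, nothing SQLite-specific:

#include <stdio.h>
#include <float.h>
#include <math.h>

int main(void)
{
    /* DBL_MANT_DIG is the mantissa width of a double, counting the
       implicit leading 1 bit: 53 for IEEE 754 binary64. */
    printf("mantissa bits: %d\n", DBL_MANT_DIG);

    /* nextafter(2.0, 1.0) is the largest double below 2.0, so the gap to
       2.0 is the spacing of doubles in [1, 2), i.e. 2^-52: once the two
       SortOrder neighbours are that close, their midpoint rounds to one
       of them and the halving trick stops producing new values. */
    printf("largest double below 2.0: %.17g\n", nextafter(2.0, 1.0));
    printf("gap below 2.0: %g\n", 2.0 - nextafter(2.0, 1.0));

    return 0;
}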

-- 
Scott Robison