http://gcc.gnu.org/bugzilla/show_bug.cgi?id=45779

           Summary: pointer difference error/ptrdiff_t representability
           Product: gcc
           Version: 4.6.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: c
        AssignedTo: unassig...@gcc.gnu.org
        ReportedBy: akla...@rumms.uni-mannheim.de


Note: initially found on gcc 4.3.2, confirmed on 4.6.0 20100924 from svn.

Consider the following program:
<code>
/* test.c */
#include <assert.h>
#include <inttypes.h>
#include <stddef.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char ** argv) {
        printf("ptrdiff_t max: %ju, size_t max: %ju\n",
                        (uintmax_t) PTRDIFF_MAX, (uintmax_t) SIZE_MAX);
        assert (argc > 1);
        size_t size = atoll(argv[1]);   /* element count taken from the command line */
        printf("requested array size: %zu\n", size);
        assert (size > 0);
        uint16_t * array = malloc(size * sizeof(*array));
        assert (array != NULL);
        /* By C99 6.5.6p9 this difference should equal `size' (here 1200000000). */
        printf("array one-past-end/start difference: %td\n",
                        &array[size] - &array[0]);
        return 0;
}
</code>

$ gcc -std=c99 -pedantic -Wall -Wextra test.c
$ ./a.out 1200000000
ptrdiff_t max: 2147483647, size_t max: 4294967295
requested array size: 1200000000
array one-past-end/start difference: -947483648

The output "-947483648" violates the C99 standard; it should be "1200000000".

This program was compiled and run on an IA-32 host with 2.5 GiB memory.

The pointer returned by the successful call to malloc() points to an array of
1200000000 uint16_t's. In this case, the number 1200000000 is smaller than
PTRDIFF_MAX and thus representable in ptrdiff_t. Hence, by C99 6.5.6p9, the
expression &array[size] - &array[0] above is defined to have type ptrdiff_t
and value 1200000000.
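
For reference, a minimal sketch of the numbers involved (assuming a 32-bit
size_t/ptrdiff_t, as shown in the output above); it merely prints the element
and byte differences side by side and does not exercise the bug:

<code>
#include <inttypes.h>
#include <stdio.h>

int main(void) {
        size_t size = 1200000000u;      /* element count used above */
        /* 6.5.6p9 only requires the *element* difference to fit in ptrdiff_t. */
        printf("element difference: %ju (PTRDIFF_MAX: %ju)\n",
                        (uintmax_t) size, (uintmax_t) PTRDIFF_MAX);
        /* The byte distance exceeds PTRDIFF_MAX, but that is irrelevant here. */
        printf("byte difference:    %ju\n",
                        (uintmax_t) size * sizeof(uint16_t));
        return 0;
}
</code>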

Note that if uint16_t were replaced with char in the above code and the program
were called with argument 2400000000 (a number larger than PTRDIFF_MAX), the
behaviour would be undefined. I therefore suspect that, internally, gcc first
computes &array[size] - &array[0] as a byte difference (as if array had type
pointer-to-char), then erroneously interprets the result as a negative 32-bit
2's complement signed integer, and finally divides it by 2 (that is,
sizeof(uint16_t)) using signed integer division.
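
Under that assumption, the observed value can be reproduced with plain integer
arithmetic (a sketch only; the conversion of the out-of-range byte count to a
32-bit signed integer is implementation-defined, but it wraps on this target):

<code>
#include <inttypes.h>
#include <stdio.h>

int main(void) {
        uint32_t byte_diff = 1200000000u * 2u;  /* byte distance between the two pointers */
        int32_t wrapped = (int32_t) byte_diff;  /* implementation-defined; -1894967296 with 2's complement wraparound */
        int32_t result = wrapped / 2;           /* signed division by sizeof(uint16_t) */
        printf("%" PRId32 "\n", result);        /* prints -947483648, matching the output above */
        return 0;
}
</code>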

Best regards,
Alexander
