The value returned by test() should be 0x7ffffffe but is 0xfffffffe.

Flags: none

(Also reproducible in GCC 3.3.5)

#include <stdio.h>

signed char a = -4;

int test(void){
        /* Expected per C: a promotes to int (-4); the cast to
           unsigned int yields 0xFFFFFFFC; that operand converts to
           long long before the division, giving 0x7FFFFFFE. */
        return (((unsigned int)(signed int) a ) / 2LL);
}

int main(void){
        int r;
        r=test();
        printf("test output:  %#x == %d      %x %x\n", r, r,
                (r == 0x7ffffffe), (r == 0xfffffffe));
        if(r == ( ((unsigned int)(signed int) (signed char) -4 ) / 2LL ))
                printf("test successful\n");
        else
                printf("test failed\n");          
        return 0; 
}

-- 
           Summary: wrong code for arith.expr: (((unsigned int)(signed int)
                    a ) / 2LL) with signed char a=-4
           Product: gcc
           Version: 3.4.3
            Status: UNCONFIRMED
          Severity: normal
          Priority: P2
         Component: middle-end
        AssignedTo: unassigned at gcc dot gnu dot org
        ReportedBy: heinrich dot brand at fujitsu-siemens dot com
                CC: gcc-bugs at gcc dot gnu dot org
 GCC build triplet: Intel-Linux
  GCC host triplet: Intel-Linux
GCC target triplet: Intel-Linux


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=19606
