https://gcc.gnu.org/bugzilla/show_bug.cgi?id=111689

            Bug ID: 111689
           Summary: Unexpected comparison result of signed long long
                    literal with zero
           Product: gcc
           Version: 14.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: translation
          Assignee: unassigned at gcc dot gnu.org
          Reporter: guminb at ajou dot ac.kr
  Target Milestone: ---

I've come across unexpected behavior when comparing the value `0` with the
signed long long literal `0x8F3700142F89C2A5LL`.

#include <stdio.h>
#include <stdint.h>

int main (int argc, char* argv[])
{
    long long zero = 0;
    /* Expected: 0, since the literal should be a negative signed value. */
    printf("Comparison result of zero <= signed long long "
           "0x8F3700142F89C2A5LL: %d\n",
           (zero <= 0x8F3700142F89C2A5LL));
    return 0;
}
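
If it helps with diagnosis, the type the compiler actually assigns to the
literal can be probed with C11 _Generic (a minimal sketch I put together for
this report, separate from the test case above); this should show whether the
literal ends up with an unsigned type:

#include <stdio.h>

int main (void)
{
    /* Print the type the compiler actually gives the literal. */
    const char *type = _Generic(0x8F3700142F89C2A5LL,
                                long long: "long long",
                                unsigned long long: "unsigned long long",
                                default: "other");
    printf("Type of 0x8F3700142F89C2A5LL: %s\n", type);
    return 0;
}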

The binary representation of 0x8F3700142F89C2A5LL is:
1000111100110111000000000001010000101111100010011100001010100101

Interpreted as a two's-complement signed 64-bit value, this bit pattern
corresponds to the decimal value -8127026915869867355; the sign bit is set, so
as a signed value it is unambiguously negative.
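
As a sanity check on that decimal value, the same bit pattern can be
reinterpreted as a signed 64-bit integer (a minimal sketch; note that
converting an out-of-range value to a signed type is implementation-defined in
C, though it wraps modulo 2^64 on typical gcc targets):

#include <stdio.h>
#include <stdint.h>

int main (void)
{
    uint64_t bits = UINT64_C(0x8F3700142F89C2A5);
    /* Implementation-defined for out-of-range values, but wraps
       modulo 2^64 on the usual gcc targets. */
    int64_t as_signed = (int64_t) bits;
    printf("%lld\n", (long long) as_signed);  /* -8127026915869867355 */
    return 0;
}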

Considering that 0x8F3700142F89C2A5LL is meant to be interpreted as a signed
long long and therefore represents a negative value, I would expect (zero <=
0x8F3700142F89C2A5LL) to evaluate to 0. However, the actual result is 1 with
several compilers, including gcc and clang.
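
Putting the two variants side by side makes the discrepancy easier to see
(a sketch, with the observed and expected results in the comments):

#include <stdio.h>

int main (void)
{
    long long zero = 0;
    /* Observed: prints 1, as if the literal were unsigned and zero
       were converted to unsigned long long for the comparison. */
    printf("%d\n", zero <= 0x8F3700142F89C2A5LL);
    /* With an explicit cast this prints 0, the result I expected
       (the cast is implementation-defined for out-of-range values,
       but yields the negative value on gcc targets). */
    printf("%d\n", zero <= (long long) 0x8F3700142F89C2A5LL);
    return 0;
}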

Interestingly, MSVC's cl compiler correctly interprets the literal as unsigned
with a ULL suffix and as signed with an LL suffix.

It appears that the compiler is not treating the literal 0x8F3700142F89C2A5LL
as a signed long long, as I believe it should, which results in this
discrepancy.

I'm seeking clarification regarding this behavior. Could this be a compiler
bug?
