https://gcc.gnu.org/bugzilla/show_bug.cgi?id=107890
Bug ID: 107890
Summary: UB on integer overflow impacts code flow
Product: gcc
Version: 12.2.0
Status: UNCONFIRMED
Severity: normal
Priority: P3
Component: c
Assignee: unassigned at gcc dot gnu.org
Reporter: gcc at pkh dot me
Target Milestone: ---

The following code is sensitive to a signed integer overflow. I was under the impression that this kind of undefined behavior essentially meant that the value of the overflowing integer could become unreliable. But apparently the effect is not limited to the value of that integer: it can also dramatically alter the code flow. Here is the pathological code:

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    uint8_t tab[0x1ff + 1];

    uint8_t f(int32_t x)
    {
        if (x < 0)
            return 0;
        int32_t i = x * 0x1ff / 0xffff;
        if (i >= 0 && i < sizeof(tab)) {
            printf("tab[%d] looks safe because %d is between [0;%d[\n",
                   i, i, (int)sizeof(tab));
            return tab[i];
        }
        return 0;
    }

    int main(int ac, char **av)
    {
        return f(atoi(av[1]));
    }

Triggering the overflow actually enters the printf/dereference branch, violating the protective condition and thus causing a crash:

    % cc -Wall -O2 overflow.c -o overflow && ./overflow 50000000
    tab[62183] looks safe because 62183 is between [0;512[
    zsh: segmentation fault (core dumped)  ./overflow 50000000

I feel extremely uncomfortable with an integer overflow affecting something other than the integer itself. Is this expected, or is it a bug?