http://gcc.gnu.org/bugzilla/show_bug.cgi?id=58143
--- Comment #12 from Bernd Edlinger <bernd.edlinger at hotmail dot de> ---
(In reply to Jakub Jelinek from comment #11)
> No, that is wrong as well.

Because it is too destructive? Maybe. I think there is a general problem here:

1. The undefined behavior warning may be triggered by artefacts of the lim
   pass, as in the class_48.f90 case.
2. Surprise optimizations may happen without this warning; see my previous
   comment #9.
3. In the case of integer overflow, "reliable" only says that the operation
   is executed in every iteration, not that the result is actually used for
   anything, as in Zhendong's example.

With array bounds I do not have the same problem, because I'd say: if the
array is accessed beyond its limit, the guarantee is void anyway, and the
lim pass would never move an array access out of the if statement, right?

But there are examples where the undefined behavior warning is not emitted
even though an out-of-bounds access is possible. A nice example of this is
gmp-4.3.2/tests/mpz/t-scan.c, which has an array bounds error:

  static const int offset[] = { -2, -1, 0, 1, 2, 3 };
  ...
  for (oindex = 0; oindex <= numberof (offset); oindex++) // +-1 error here
    {
      o = offset[oindex];
      ...
      if (got != want)
        {
          ...
          exit (1); // this cancels the aggressive-loop-optimizations warning
        }
      ...
    }

The code generated at -O2 omits the loop termination check, surprise
surprise...

What do you think?
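
To make point 1 concrete, here is a minimal sketch (my own construction,
not a test case from this PR; the function and parameter names are made up):

  /* The multiplication is loop-invariant but only conditionally
     executed in the source.  If the lim pass hoists it out of the if
     and out of the loop, it executes unconditionally, and a signed
     overflow that the source never triggered becomes "reliably
     executed" as far as the warning machinery is concerned.  */
  void
  foo (int *out, int n, int m, int cond)
  {
    int i;
    for (i = 0; i < n; i++)
      if (cond)
        out[i] = m * 0x12345;  /* invariant; may overflow for large m */
  }

If cond is false at run time, the source never evaluates m * 0x12345, yet
after hoisting the possible overflow would occur on every call.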
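
And here is a standalone sketch of the t-scan.c pattern (my own reduction,
not the actual gmp test; the argc comparison merely stands in for the real
got != want check, and numberof is spelled as in gmp):

  #include <stdio.h>
  #include <stdlib.h>

  #define numberof(x)  (sizeof (x) / sizeof ((x)[0]))

  static const int offset[] = { -2, -1, 0, 1, 2, 3 };

  int
  main (int argc, char **argv)
  {
    size_t oindex;
    (void) argv;

    /* Off-by-one: <= should be <, so the last iteration would read
       offset[6], one past the end of the array.  At -O2, GCC may use
       that undefined access to conclude the final iteration cannot
       happen and drop the loop termination check.  */
    for (oindex = 0; oindex <= numberof (offset); oindex++)
      {
        int o = offset[oindex];

        if (o != argc)
          {
            printf ("mismatch: %d\n", o);
            exit (1);  /* this early exit is what suppresses the
                          -Waggressive-loop-optimizations warning */
          }
      }
    return 0;
  }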