On Sat, Jun 29, 2019 at 5:41 AM Paul Eggert <egg...@cs.ucla.edu> wrote:
> Pip Cet wrote:
> >   eassume (global == 0);
> >   eassume (f ());
> > #else
> >   eassume (global == 0 && f ());
> > ...
> > extern int global;
> >
> > int f(void)
> > {
> >   return ++global;
> > }
>
> This is not a valid use of 'assume'. It's documented that assume's argument
> should be free of side effects.
But the compiler makes no such assumption, so it cannot optimize
assume (i >= 0 && f ()) (where i is a global, or a non-const pointer to i
has potentially leaked) unless f happens to be available at optimization
time.

I think this is an interesting point: if GCC decided to add a
__builtin_assume() builtin, we could give it slightly different semantics:
that the expression passed to it evaluates to true, and neither evaluates
to false nor fails to evaluate. Something like __attribute__((does_return))
might express the same thing for a function.

However, if I'm not too confused, we're discussing whether

  assume (SIMPLE_CONDITION && COMPLICATED_CONDITION)

is ever a good idea. With the old assume, it's harmful. With the new
assume, it's pointless. Is

  assume (SIMPLE_CONDITION);
  assume (COMPLICATED_CONDITION);

a good idea? With the old assume, it's harmful. With the new assume, it's
a more verbose way of simply assuming SIMPLE_CONDITION, so it might be a
good idea.

Also, "should" doesn't mean "must", does it? I'd prefer rewording that
sentence as "R may or may not be evaluated; it should not normally have
side effects".

> It would be nice if 'assume (R)' reported an error if R has side effects,
> and generated a warning unless the compiler can verify that R is free of
> side effects. However, these niceties would require better support from
> the compiler.

But if we had that support from the compiler, wouldn't it be even nicer to
give up (most of) the distinction between assert and assume and just tell
people to use assume? That idea was my starting point, and I'm still
convinced it would result in better code overall. Except someone would
have to grep a little once in a while and replace most eassume (A && B)
expressions with eassume (A); eassume (B);

However, there are a few tricks we can use to verify this in special
debug builds.
------
Putting inner functions into sections:

#define assume(C)                                                         \
  ({                                                                      \
    auto void inner (void)                                                \
      __attribute__ ((section ("only_trivial_functions_go_here"), used)); \
    void inner (void) { (void) (C); }                                     \
    (void) 0;                                                             \
  })

Then verify that the section contains only trivial function definitions.
(For the record, for Emacs, it does.)

Another approach for detecting when (C) has "global" side effects (such as
calling an external function) is to do this:

-------
#include <stdio.h>

int global;

extern void dummy (void (*) (void))
  __attribute__ ((warning ("assume might have side effects")));

#define C printf ("3")

int
main (void)
{
  auto void inner (void) __attribute__ ((used));
  void inner (void)
  {
    if (global)
      __builtin_unreachable ();
    (void) (C);
    if (global)
      dummy (inner);
  }
}
-----

A third possibility is to use __builtin_constant_p (!(C) != !(C)), as the
patch does. That doesn't state precisely that C has no side effects, but
it does come fairly close in practice ... except for the inline function
problem.

> > If you want your program to behave predictably, in the strict sense,
> > you cannot ever use the current assume() API.
>
> I'm not sure what you mean by "in the strict sense". It's true that
> programs can misuse 'assume' and get undefined behavior, but that's kind
> of the point....

Precisely what I meant.