https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78685

--- Comment #40 from Fredrik Tolf <fredrik at dolda2000 dot com> ---
>Assignments to local variables are a different topic, a bit less crucial for 
>debugging.

>such a resolution means that people will frequently end up using -O0 for 
>debugging, and O0 is far more expensive than it needs to be for viable 
>debugging.

I would like to add here a question about what -Og is actually supposed to
mean. The manpage says -Og is to "Optimize debugging experience". My naive
interpretation of this sentence is that -Og should offer "the best" debugging
experience, better than any other option. Currently, that is clearly not the
case: -O0 gives an objectively better debugging experience.

If -Og is instead supposed to be "the best balance" between performance and
debuggability, that is fine too, but then I think at least the documentation
should be updated to reflect it.


The manpage also states that -Og offers optimization "while maintaining [...]
a good debugging experience". That too is a reasonable meaning of -Og, one that
I think is useful and should be preserved if possible. Right now, I don't
personally find that -Og offers a particularly good debugging experience, which
is why I always use -O0 instead. But I assume that gap is the core problem this
bug is meant to track, and that there is an intention to fix it, which I think
would be a good thing.


TL;DR: There seems to be some confusion about what -Og is really supposed to
achieve.
