https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78685

--- Comment #38 from Lukas Grätz <[email protected]> ---
(In reply to Eric Gallager from comment #37)
> (In reply to Richard Sandiford from comment #36)
> > Regarding the subject change to add "it should generate artificial uses for
> > variables at end of scope to avoid this" [...]
> 
> My purpose in retitling was primarily just to make the issue easier to find
> again, next time Jakub mentions it, since that's the solution he usually
> proposes.

If you try to debug code compiled with -Og, you run into the "<optimized
out>" problem described here. Then, in the GCC bug tracker, you would search
for "Og optimized out", which gives just 10 results. I see no issue with the
bug tracker's search function, and I also see no need to abuse the title to
make the bug "easier to find".

> > [...] It would keep live the subset of assigned values that can reach
> > the end of the scope, and so would be better than the status quo, but
> > it wouldn't help with other assignments.
> > 
> > Also, keeping the value live at the end of the scope guarantees that the
> > executable code can calculate the value at that particular point in the
> > program.  It doesn't (IIUC) guarantee that a debugger can calculate the
> > correct value at all intermediate points, even with var-tracking.  It's more
> > about increasing the chances.

To be fair, the "artificial uses for variables" should prevent "<optimized
out>" for function call arguments along the call stack, and this seems
plausible. Assignments to local variables are a different topic, a bit less
crucial for debugging. I still think the best way to resolve the present issue
is to deprecate the -Og option.
