On Fri, Mar 24, 2017 at 12:45:28PM +0100, Richard Biener wrote:
> On Fri, Mar 24, 2017 at 9:43 AM, Jakub Jelinek <[email protected]> wrote:
> > On Fri, Mar 24, 2017 at 09:29:00AM +0100, Richard Biener wrote:
> >> Yeah, the thing BLOCK_NONLOCALIZED_VARS wants to do is optimize generated
> >> dwarf by adding a DW_AT_abstract_origin (just to refer to the
> >> subprogram DIE) but
> >
> > Well, for FUNCTION_DECLs in BLOCK_VARS/BLOCK_NONLOCALIZED_VARS we actually
> > don't emit any further DIE and so there is no DW_AT_abstract_origin.
> > E.g. gen_subprogram_die has:
> >   /* Detect and ignore this case, where we are trying to output
> >      something we have already output.  */
> >   if (get_AT (old_die, DW_AT_low_pc)
> >       || get_AT (old_die, DW_AT_ranges))
> >     return;
>
> Hmm, but we do want to put the function in scope?  Thus
>
> void foo () {}
> void bar ()
> {
>   int foo;
>   {
>     void foo();
>     foo();
>   }
> }
>
> should have a DIE for foo in bar (possibly referring to the concrete instance
> for optimization).
We actually do that.  If I change your testcase so that it actually triggers
the changed part of the code (so that there is inlining etc.), like this:
volatile int v;
__attribute__((noinline)) void foo () { v++; }
static inline void bar ()
{
  int foo;
  {
    void foo();
    foo();
  }
}
void
baz (void)
{
  bar ();
  bar ();
}
then at -gdwarf-3 -dA -O2 the difference between vanilla GCC and my patched GCC
is the following (I used -gdwarf-3 so that there are no DW_FORM_flag_present
forms that would result in DIE offset changes):
.uleb128 0xd # (DIE (0x117) DW_TAG_subprogram)
.ascii "bar\0" # DW_AT_name
.byte 0x1 # DW_AT_decl_file (prQQ.c)
.byte 0x3 # DW_AT_decl_line
.byte 0x3 # DW_AT_inline
.long 0x13c # DW_AT_sibling
.uleb128 0xe # (DIE (0x123) DW_TAG_variable)
.ascii "foo\0" # DW_AT_name
.byte 0x1 # DW_AT_decl_file (prQQ.c)
.byte 0x5 # DW_AT_decl_line
.long 0x41 # DW_AT_type
.uleb128 0xf # (DIE (0x12e) DW_TAG_lexical_block)
.uleb128 0x10 # (DIE (0x12f) DW_TAG_subprogram)
.byte 0x1 # DW_AT_external
.ascii "foo\0" # DW_AT_name
.byte 0x1 # DW_AT_decl_file (prQQ.c)
.byte 0x7 # DW_AT_decl_line
- .byte 0 # DW_AT_inline
+ .byte 0x1 # DW_AT_declaration
.uleb128 0x11 # (DIE (0x138) DW_TAG_unspecified_parameters)
.byte 0 # end of children of DIE 0x12f
.byte 0 # end of children of DIE 0x12e
.byte 0 # end of children of DIE 0x117
.uleb128 0x12 # (DIE (0x13c) DW_TAG_subprogram)
.byte 0x1 # DW_AT_external
.ascii "foo\0" # DW_AT_name
.byte 0x1 # DW_AT_decl_file (prQQ.c)
.byte 0x2 # DW_AT_decl_line
.quad .LFB0 # DW_AT_low_pc
.quad .LFE0 # DW_AT_high_pc
.byte 0x1 # DW_AT_frame_base
.byte 0x9c # DW_OP_call_frame_cfa
.byte 0x1 # DW_AT_GNU_all_call_sites
.byte 0 # end of children of DIE 0xb
and a corresponding change in .debug_abbrev.
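(To reproduce the comparison, assuming the testcase above is saved as prQQ.c,
as the DW_AT_decl_file comments suggest, something like
  gcc -S -O2 -gdwarf-3 -dA prQQ.c
with the vanilla and with the patched compiler and then diffing the two .s
files should do.)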
That seems to be the right change to me: inside of bar (the DW_AT_inline one)
we have a declaration of foo, and we properly use DW_AT_declaration for it;
then at the toplevel we actually have the full definition DIE for foo;
and then in two spots inside baz we have
.uleb128 0x6 # (DIE (0x6d) DW_TAG_inlined_subroutine)
.long 0x117 # DW_AT_abstract_origin
without the need to duplicate the foo declaration there, as it is inherited
through DW_AT_abstract_origin.
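To spell out the resulting shape, the patched DIE tree for this testcase looks
roughly like this (offsets taken from the -dA comments above; sibling order and
the DIEs not quoted are approximated):
DW_TAG_compile_unit (0xb)
  DW_TAG_subprogram                      # baz, the concrete out-of-line code
    DW_TAG_inlined_subroutine (0x6d)     # DW_AT_abstract_origin -> 0x117 (bar)
    DW_TAG_inlined_subroutine            # second inlined copy of bar, likewise
  DW_TAG_subprogram (0x117)              # bar, abstract instance (DW_AT_inline)
    DW_TAG_variable (0x123)              # the local "int foo"
    DW_TAG_lexical_block (0x12e)
      DW_TAG_subprogram (0x12f)          # block-scope "void foo();", now DW_AT_declaration
        DW_TAG_unspecified_parameters (0x138)
  DW_TAG_subprogram (0x13c)              # foo, full definition (DW_AT_low_pc/DW_AT_high_pc)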
Jakub