Hello,

I haven't followed this thread too closely, in particular I haven't seen 
why the current form of the .DEFERRED_INIT call was chosen or suggested, 
but it triggered my "well, that's obviously wrong" gut feeling; so sorry 
for stating something which might be obvious to thread participants here.  
Anyway:

On Thu, 1 Jul 2021, Richard Sandiford via Gcc-patches wrote:

> >> It's not a bug in SSA, it's at most a missed optimization there.
> >
> > I still think that SSA cannot handle block-scoped variable correctly 
> > in this case, it should not move the variable out side of the block 
> > scope. -:)
> >
> >>  But with -ftrivial-auto-var-init it becomes a correctness issue.
> 
> I might have lost track of what “it” is here.  Do you mean the 
> propagation, or the fact that we have a PHI in the first place?
> 
> For:
> 
> unsigned int
> f (int n)
> {
>   unsigned int res = 0;
>   for (int i = 0; i < n; i += 1)
>     {
>       unsigned int foo;
>       foo += 1;

Note that here foo is used uninitialized.  That is the thing from which 
everything else follows.  Garbage in, garbage out.  It doesn't make much 
sense to argue that the PHI node generated on this loop (because of the 
exposed-upward use of foo) is wrong, or right, or should better be 
something else.  The input program was broken, so anything makes sense.
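For illustration, after SSA rewriting the loop carries foo through a PHI 
at the loop header, roughly like this (SSA version numbers are 
illustrative):

  # foo_3 = PHI <foo_4(D) (preheader), foo_5 (latch)>
  foo_5 = foo_3 + 1;

The foo_4(D) operand is the undefined default definition, and the PHI 
then carries foo's garbage value from one iteration to the next.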

That is the same with Qing's shorter testcase:

  for (i = 0; i < a; i++) {
    if (__extension__({int size2;
        size2 = ART_INIT (size2, 2);

Uninitialized use of size2 right there.  And it's the same for the use of 
.DEFERRED_INIT as the patch does:

{
  int size2;
  size2 = .DEFERRED_INIT (size2, 2);
  size2 = 4;
  _1 = size2 > 5;
  D.2240 = (int) _1;
}

The argument of the pseudo-function call to .DEFERRED_INIT is used 
uninitialized -> boom.
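For illustration, after going into SSA that self-referential argument 
becomes a use of the undefined default definition, roughly (SSA version 
numbers are illustrative):

  size2_5 = .DEFERRED_INIT (size2_4(D), 2);

The (D) marks a default definition, i.e. an undefined value flowing 
straight into the call.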

You can't solve any of this by fiddling with how SSA rewriting behaves at 
a large scale.  You need to change this and only this specific 
interpretation of a use.  Or preferably not generate it to start with.

> IMO the foo_3 PHI makes no sense.  foo doesn't live beyond its block,
> so there should be no loop-carried dependency here.
> 
> So yeah, if “it” meant that the fact that variables live too long,
> then I agree that becomes a correctness issue with this feature,
> rather than just an odd quirk.  Could we solve it by inserting
> a clobber at the end of the variable's scope, or something like that?

It would possibly make GCC not generate a PHI on this broken input, yes.  
But why would that be an improvement?
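For illustration, such a clobber would look roughly like this in GIMPLE 
(placement is illustrative):

  {
    unsigned int foo;
    foo = foo + 1;
    foo ={v} {CLOBBER};
  }

With foo's lifetime ended at the scope exit of each iteration, the SSA 
rewriter has no live value of foo to carry around the back edge, so the 
PHI would disappear.  But, again, on this broken input that doesn't make 
the program any less undefined.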

> > Agreed, I have made such change yesterday, and this work around the 
> > current issue.
> >
> > temp = .DEFERRED_INIT (temp/WITH_SIZE_EXPR(temp), init_type)
> >
> > To:
> >
> > temp = .DEFERRED_INIT (SIZE_of_temp, init_type)
> 
> I think we're going round in circles here though.  The point of having
> the undefined input was so that we would continue to warn about undefined
> inputs.  The above issue doesn't seem like a good justification for
> dropping that.

If you want to rely on the undefined use for error reporting, then you 
must only generate an undefined use where there was one before; you can't 
just insert new undefined uses.  I don't see how it could be otherwise: 
as soon as you introduce new undefined uses, you can and will run into 
GCC making use of the undefinedness, and not just in this particular 
issue with lifetime and PHI nodes, which might be "solved" by clobbers.

I think it's a chicken-and-egg problem: you can't add undefined uses 
without knowing whether there already was one, but the facility is 
supposed to help detect whether there is one to start with.


Ciao,
Michael.
