On Mon, Nov 14, 2022, 10:49 David Brown <da...@westcontrol.com> wrote:

>
>
> On 14/11/2022 16:10, NightStrike wrote:
> >
> >
> > On Mon, Nov 14, 2022, 04:42 David Brown via Gcc <gcc@gcc.gnu.org>
> > wrote:
> >
> >     Warnings are not perfect - there is always the risk of false
> >     positives
> >     and false negatives.  And different people will have different ideas
> >     about what code is perfectly reasonable, and what code is risky and
> >     should trigger a warning.  Thus gcc has warning flag groups (-Wall,
> >     -Wextra) that try to match common consensus, and individual flags for
> >     personal fine-tuning.
> >
> >     Sometimes it is useful to have a simple way to override a warning in
> >     code, without going through "#pragma GCC diagnostic" lines (which are
> >     powerful, but not pretty).
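> >
> >     As a sketch of that pragma mechanism (these are GCC's documented
> >     diagnostic pragmas; the specific warning name here is just an
> >     example), one could write:
> >
> >              #pragma GCC diagnostic push
> >              #pragma GCC diagnostic ignored "-Wmaybe-uninitialized"
> >              /* ... code that would otherwise trigger the warning ... */
> >              #pragma GCC diagnostic pop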
> >
> >     So if you have :
> >
> >              int i;
> >              if (a == 1) i = 1;
> >              if (b == 1) i = 2;
> >              if (c == 1) i = 3;
> >              return i;
> >
> >     the compiler will warn that "i" may not be initialised.  But if you
> >     /know/ that one of the three conditions will match (or you don't care
> >     what "i" is if it does not match), then you know your code is fine
> >     and
> >     don't want the warning.  Writing "int i = i;" is a way of telling the
> >     compiler "I know what I am doing, even though this code looks dodgy,
> >     because I know more than you do".
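> >
> >     Applied to the snippet above, that is simply (a minimal sketch,
> >     assuming a, b and c are in scope):
> >
> >              int i = i;   /* self-init: suppress the warning */
> >              if (a == 1) i = 1;
> >              if (b == 1) i = 2;
> >              if (c == 1) i = 3;
> >              return i;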
> >
> >     It's just like writing "while ((*p++ = *q++));", or using a cast
> >     to void to turn off an "unused parameter" warning.
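> >
> >     For instance (a sketch - the function and parameter names here
> >     are arbitrary):
> >
> >              void on_signal(int sig)
> >              {
> >                  (void) sig;   /* silences -Wunused-parameter */
> >              }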
> >
> >
> > Wouldn't it be easier, faster, and more obvious to the reader to just
> > use "int i = 0"? I'm curious what a real world use case is where you
> > can't do the more common thing of "= 0".
> >
>
> You can write "int i = 0;" if you prefer.  I would not, because IMHO
> doing so would be wrong, unclear to the reader, less efficient, and
> harder to debug.
>
> In the code above, the value returned should never be 0.  So why should
> "i" be set to 0 at any point?  That's just an extra instruction the
> compiler must generate (in my line of work, my code often needs to be
> efficient).  More importantly, perhaps, it means that if you use
> diagnostic tools such as sanitizers you are hiding bugs from them
> instead of catching them - a sanitizer could catch the case of "return
> i;" when "i" is not set.
>
> (I don't know if current sanitizers will do that or not, and haven't
> tested it, but they /could/.)
>
> But I'm quite happy with :
>
>         int i = i;      // Self-initialise to silence warning
>
> I don't think there is a "perfect" solution to cases like this, and
> opinions will always differ, but self-initialisation seems a good choice
> to me.  Regardless of the pros and cons in this particular example,
> gcc's handling of self-initialisation is, AFAIUI, deliberate: the
> warning is suppressed precisely so that those who want to use the
> idiom can do so.


Thanks for the extended explanation, insight, and detail!
