A Guy With an Opinion wrote:

That is true, but I'm still unconvinced that making the person's program likely to error is better than initializing a number to 0. Zero is such a fundamental default for so many things. And it would be consistent with the other number types.
basically, default initializers aren't meant to give a "usable value", they're meant to give a *defined* value, so we don't have UB. that is, just initialize your variables explicitly, don't rely on defaults. writing:

        int a;
        a += 42;

is still bad code, even if you know that `a` is guaranteed to be zero.

        int a = 0;
        a += 42;

is the "right" way to write it.

if you look at default values from this PoV, you'll see that NaN makes more sense than zero. if there were a NaN for ints, ints would be inited with it too. ;-)
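
here's a minimal sketch in D (the names and values are mine, just for illustration) showing how the float default surfaces a forgotten initialization while the int default silently hides it:

        import std.stdio;

        void main()
        {
            double d;      // default-initialized to double.nan
            d += 1.0;      // NaN propagates, so the missing init stays visible
            writeln(d);    // prints "nan"

            int a;         // default-initialized to 0
            a += 42;       // silently becomes 42, the missing init is hidden
            writeln(a);    // prints "42"
        }

with a hypothetical int.nan, the second half would fail just as loudly as the first.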
