On Wednesday, 22 November 2017 at 14:51:02 UTC, codephantom wrote:
> The core language of D does NOT need what C# is proposing -
> that is my view.
"Need"? Perhaps not. But so far, I haven't seen any arguments
that refute the utility of mitigating patterns of human error.
> If, over time, a large number of D programmers have the same
> laissez-faire approach towards checking for null as C#
> programmers, then maybe they'll start demanding the same thing
> - but even then, I'll argue the same points I've argued thus
> far.
Null references have been a problem in every language that has
them. Just because D is much nicer than its predecessors (and
contemporaries, IMO) doesn't mean the "bad old days" (still in
progress) of C and C++ didn't happen or that we cannot or should
not learn from the experience. Tony Hoare doesn't call null his
sin and "billion-dollar mistake" as a mere fit of pique. In
other words, "Well, don't do that, silly human!" ends up being
an appeal to tradition.
> Perhaps that's why I've never considered nulls to be an issue.
> I take proactive steps to protect my code, before the compiler
> ever sees it. And actually, I cannot recall any null-related
> error in any code I've deployed. It's just never been an issue.
Oh, that explains it. He's a _robot_! ;)
(The IDE thing is entirely irrelevant to this discussion; why did
you bring that up?)
> And that's another reason why this topic interests me - why is
> it such an issue in the C# community? From Mads' blog about it,
> it seems to be because they're just not doing null checks. And
> so the language designers are being forced to step in. If
> that's not the reason, then I've misunderstood, and await the
> correct explanation.
Again, it's never _not_ been a problem. That C# is nearly old
enough to vote in general elections, yet is only now doing this,
should be telling. (And I fully expect this conversation has
been going on for at least half of that time.)
It's probably galvanised by the recent proliferation of languages
that hold safety to a higher standard and the community realising
that the language can and _should_ share the burden of mitigating
patterns of human error.
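For anyone who hasn't watched one of those languages do it, here's roughly what that burden-sharing looks like under TypeScript's strictNullChecks option, which is closely analogous to what C# is proposing (the findUser function is just a made-up illustration, not anyone's real API):

```typescript
// With strictNullChecks enabled, `string` and `string | null` are
// distinct types; the compiler rejects unguarded dereferences of a
// possibly-null value instead of trusting the human to remember.
function findUser(id: number): string | null {
  return id === 1 ? "alice" : null; // null is part of the declared type
}

const user = findUser(2);
// user.length;              // compile error: 'user' is possibly 'null'
if (user !== null) {
  console.log(user.length);  // OK: 'user' is narrowed to 'string' here
}
```

The point isn't that you can't write the check by hand today - you can, in D or anything else - it's that the compiler refuses to let you forget it.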
-Wyatt