Nick Sabalausky wrote:
"Georg Wrede" <[email protected]> wrote in message
news:[email protected]...
Nick Sabalausky wrote:
"Kagamin" <[email protected]> wrote in message
>>>> I doubt that blunt non-null forcing will solve this problem. If
>>>> you're forced to use non-null, you'll invent a means to fool the
>>>> compiler, some analogue of a null reference - a stub object, whose
>>>> use will result in the same bug, with the difference that the
>>>> application won't crash immediately, but will behave in an
>>>> unpredictable way, at some point causing some other exception, so
>>>> eventually you'll get your crash. The profit will be infinitesimal,
>>>> if any.
>>> The idea is that non-null would not be forced, but rather be the
>>> default, with an optional nullable for the times when it really is
>>> needed.
>> This is interesting. I wonder what the practical result of non-null
>> as the default will be. Will programmers bother to specify nullable
>> when needed, or will they "try to do the [perceived] Right Thing" by
>> assigning stupid default values?
>>
>> If the latter happens, then we really are worse off than with nulls.
>> Then searching for the elusive bug will be much more work.
> Interesting point. We should probably keep an eye on the languages
> that use the "Foo" vs "Foo?" syntax for non-null vs nullable to see
> what usage patterns arise. Although, I generally have little more than
> contempt for programmers who blindly do what they were taught (by
> other amateurs) is usually "the right thing" without considering
> whether it really is appropriate for the situation at hand.
>
> Then again, I would think there are already plenty of examples of
> things we use that could make matters worse if people used them
> improperly.
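
For what it's worth, here's roughly what I'd expect Kagamin's
stub-object workaround to look like, sketched in plain D. The non-null
checking itself is hypothetical (imagine a "Foo" vs "Foo?" regime like
the one discussed above), and all the names below are invented for
illustration:

class Connection
{
    void send(string msg) { /* write to the socket */ }
}

// A do-nothing stub invented purely to satisfy a hypothetical
// non-null requirement on the field below.
class StubConnection : Connection
{
    override void send(string msg) { /* silently drop the message */ }
}

class Session
{
    Connection conn;  // imagine: must provably be non-null

    this()
    {
        // The lazy "fix": plug in a stub instead of restructuring
        // the code so a real Connection is available here.
        conn = new StubConnection();
    }
}

void main()
{
    auto s = new Session();
    s.conn.send("hello");  // no segfault, but the message silently
                           // vanishes -- the bug survives, it has
                           // just stopped being loud
}

The failure mode is exactly what Kagamin predicts: the crash moves from
the dereference site to wherever the missing message is finally
noticed.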
An interesting thought occurred to me just now. IIRC, Walter's argument
for always zeroing memory at allocation was to give "sensible starting
values" and to "more easily see if data is uninitialised".

But if assignment before use were compulsory, then we wouldn't need to
zero out memory anymore. That ought to speed up data-intensive tasks.
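
As a minimal sketch of the status quo (this is today's D; only the
enforcement imagined in the comments is hypothetical):

import std.stdio;

void main()
{
    // Today D initialises everything to T.init -- zero for integers
    // -- which costs a clear of the whole block up front.
    int[1000] a;
    assert(a[0] == 0);

    // The existing escape hatch: "= void" skips initialisation, but
    // reading before assignment is then exactly the bug class at
    // issue.
    int[1000] b = void;

    // If assignment-before-use were enforced by the compiler, the
    // "= void" behaviour could safely become the default.
    foreach (i, ref x; b)
        x = cast(int) i;  // assign before any use

    writeln(a[0], " ", b[1]);
}

For large allocations in hot paths, that up-front clearing is where the
speedup would presumably come from.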