On 15-09-2012 19:13, Jonathan M Davis wrote:
> On Saturday, September 15, 2012 15:24:27 Henning Pohl wrote:
>> On Saturday, 15 September 2012 at 12:49:23 UTC, Russel Winder wrote:
>>> On Sat, 2012-09-15 at 14:44 +0200, Alex Rønne Petersen wrote:
>>>> […]
>>>> Anyway, it's too late to change it now.
>>> I disagree. There are always opportunities to make changes to things;
>>> you just have to manage things carefully.
>> I don't know if people really use the ability of references to be
>> null. If so, large amounts of code will be broken.
> Of course people use it. Having nullable types is _highly_ useful. It
> would suck if references were non-nullable. That would be _horrible_
> IMHO. Having a means to have non-nullable references for cases where
> that makes sense isn't necessarily a bad thing, but null is a very
> useful construct, and I'd _hate_ to see normal class references be
> non-nullable.
>
> - Jonathan M Davis
Out of curiosity: Why? How often does your code actually accept null as
a valid state of a class reference?
I find that more often than not, code is written with the assumption
that null doesn't exist. As a great fan of functional languages, I'm
always sad when a language picks unconstrained null over nullable types
or an Option<T> type.
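
To make that concrete, here is a rough sketch of the difference I mean.
The Option type below is hand-rolled purely for illustration (Phobos
ships std.typecons.Nullable for roughly this purpose); nothing here is
meant as a proposal for how D would have to do it:

class Widget {}

// An Option-style wrapper, written out only to illustrate the idea.
struct Option(T)
{
    private T value;
    private bool present;

    static Option some(T v) { return Option(v, true); }
    static Option none()    { return Option.init; }

    bool isSome() const { return present; }

    // Asserts when empty, so "forgot to check" fails loudly
    // instead of dereferencing a null reference somewhere later.
    T unwrap()
    {
        assert(present, "unwrapped an empty Option");
        return value;
    }
}

// Absence is visible in the signature, so the caller has to deal with it:
Option!Widget findWidget(string name)
{
    return Option!Widget.none();
}

void main()
{
    auto w = findWidget("gear");
    if (w.isSome())
    {
        auto widget = w.unwrap(); // guaranteed non-empty here
    }
    else
    {
        // handle the missing case explicitly
    }
}

With unconstrained null, the "missing" case is invisible in the type and
the check is easy to skip; with an Option-style type it is part of the
signature.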
--
Alex Rønne Petersen
a...@lycus.org
http://lycus.org