> On May 10, 2018, at 11:53 AM, Brian Goetz <brian.go...@oracle.com> wrote:
>
>> Well, yes, but remember also that if `Q` started its career as an object
>> type, it was a value-based class, and such classes are documented as being
>> null-hostile. The null-passers were in a fool's paradise.
>
> Objection, Your Honor, assumes facts not in evidence! The letters "n-u-l-l"
> do not appear in the definition of value-based linked above. Users have no
> reason to believe any of the following are bad for a VBC, among others:
>
>     V v = null;
>
>     if (v == null) { ... }
>
>     List<V> list = new ArrayList<>();
>     list.add(null);
>
> In Q world, these uses of V were compiled to LV rather than QV, so these
> idioms mapped to a natural and sensible translation. Our story was that when
> you went to recompile, the stricter requirements of value-ness would be
> enforced, and you'd have to fix your code. (That is: making V a value is
> binary compatible but not necessarily source compatible. This was a pretty
> valuable outcome.)
>
> One of the possible coping strategies in Q world for such code is to allow
> the box type LV to be denoted at source level in a possibly-ugly way, so that
> code like the above could continue to work by saying "V.BOX" instead of "V".
> IOW, you can opt into the old behavior, and even mix and match between int
> and Integer. So users who wanted to fight progress had some escape hatches.
>
> While I don't have a specific answer here, I do think we have to back up and
> reconsider the assumption that all uses of V were well informed about V's
> null-hostility, and have a more nuanced notion of the boundaries between
> V-as-value-world and V-as-ref-world.
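For what it's worth, the quoted idioms are easy to check against a class that is value-based today. A minimal, self-contained sketch (using java.time.LocalDate, whose javadoc marks it as value-based, as a stand-in for V — the class choice is illustrative, not from the thread) showing that every one of these null-using idioms compiles and runs without complaint on current JDKs, which is exactly the "fool's paradise" being debated:

```java
import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;

public class VbcNullIdioms {
    public static void main(String[] args) {
        // V v = null; -- legal today, since V translates to the
        // reference type LV, whose value set includes null.
        LocalDate v = null;

        // if (v == null) { ... } -- a null check is an ordinary,
        // unremarkable reference comparison.
        if (v == null) {
            System.out.println("v is null");
        }

        // List<V> list = ...; list.add(null); -- collections of a
        // value-based class happily store null elements.
        List<LocalDate> list = new ArrayList<>();
        list.add(null);
        System.out.println("list contains null: " + list.contains(null));
    }
}
```

Under the migration story described above, code like this would keep working in binaries compiled against the old V, and only break (and need fixing, or a V.BOX-style opt-out) once recompiled against V-as-value.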
If source files tend to be migrated as whole units rather than bit by bit, then perhaps this problem can be considered similar to compiling with the --release flag? That way there is no explicit intermixing of the two worlds within one compilation unit.

Paul.