On Thu, Jun 1, 2023 at 3:42 PM Brian Goetz <[email protected]> wrote:
> Another way to think about this is the historical progression.
>
> In hardware / assembly language, when you load a value from a register or
> memory location, and you haven't provably put a good value there, you get
> an out-of-thin-air value. There is no "safe initialization" of anything;
> there is just memory and registers.
>
> C more or less propagates this approach; if you don't initialize a
> variable, you get what you get. C doesn't force indirections on you; you
> have to ask for them.
>
> Java 1.0 moved the ball forward for direct storage, saying that the
> initial value of any heap variable is zero, and requiring that locals be
> initialized before use. It also moved the ball forward by putting
> indirections in for you where they are semantically needed. But Java
> copies what C did for primitives; in the desire to not make arithmetic
> ungodly expensive, an int is just a direct machine int, like in C, with a
> frosting of bzero.
>
> Valhalla gives us a choice: we can flatten more non-nullable things, or we
> can cheapen (somewhat) the things that use null as an initialization
> guard. The good thing is that we can choose which we want as the
> situation warrants; the bad thing is we can make bad choices.
>
> People will surely make bad choices to get the flattening benefits of B3,
> because, performance! But this is not all that different from the other
> bad performance-overrotations that people make in Java every day, other
> than this one is new and so people will initially fall into it more.

I'm not necessarily following the impact of these statements on the
arguments I'm making. I don't think I'm raising concerns about people
picking the wrong bucket. I'm trying to establish that there's never
anything actually *good* about default initialization; that at the very
best it's "harmless and very slightly convenient", no more: a typing saver
in exchange for bug risk.
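The "typing saver in exchange for bug risk" trade can be seen in a few lines of today's Java. The sketch below is illustrative only, not Valhalla code; the `Complex` record is a hypothetical stand-in for a value-class candidate. It shows the asymmetry: a zero-defaulted primitive flows silently through arithmetic, while a null-defaulted reference at least fails fast.

```java
// A minimal sketch (not Valhalla code): the two default-initialization
// failure modes available in today's Java.
public class DefaultInitDemo {
    // Hypothetical value-class candidate, modeled as a plain record.
    record Complex(double re, double im) {}

    public static void main(String[] args) {
        // Primitive defaults: "a frosting of bzero". The zeros flow through
        // arithmetic silently; nothing ever flags the missing initialization.
        double[] prims = new double[4];
        double sum = 0;
        for (double d : prims) sum += d;
        System.out.println(sum);  // prints 0.0 -- possibly a bug, never reported

        // Reference defaults: null at least announces itself on first use.
        Complex[] refs = new Complex[4];
        try {
            System.out.println(refs[0].re());
        } catch (NullPointerException e) {
            System.out.println("uninitialized element detected");
        }
    }
}
```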
Notably it's at its most harmless for nullable types, which are the more
likely ones to blow up outright when used uninitialized. But those aren't
the cases this thread is focusing on.

> As far as I know, Valhalla could *maybe* still decide to require that
> non-nullable variables of value-class types be initialized explicitly.
> Maybe it would complicate migration too much, I'm not sure. I know it
> would require one new feature:
>
> `new ValClass[100] (int i -> something())`
>
> For fields, we can close the gap by doing null checks at constructor end
> (in the same place we emit memory barriers to support the final field
> initialization safety guarantees.) The latter is voided when `this`
> escapes construction, and the former would be as well, but this seems a
> pragmatic choice which narrows the initially-null problem quite a bit for
> fields.

I do like that, but again I think it's addressing a different set of issues
than I'm trying to. My brain is certainly fuzzy, though. Again I'd say that
initialization problems that leave something *null* are probably the
least-harmful kind, thanks to our beloved friend NullPointerException.

I'm wondering why we shouldn't require fields of non-nullable value-class
types to be explicitly initialized: `Complex x = new Complex(0, 0)` or
`Complex x = new Complex()`. I'll stipulate "people would grumble" as
self-evident.

> As you point out, arrays are harder, and it requires something much like
> what you suggest (which is also a useful feature in its own right.) Note
> that none of this is needed for B3!, only for B1!/B2!.
>
> . . .
>
> Nothing wrong with `new B3![n]`, any more than `new int[n]`. It's B1/B2
> that have the problem.

I think I *am* talking about B3? If you could reread my message it might
help. Or you might just tell me to reread yours (which I would). :-)

-- 
Kevin Bourrillion | Java/Kotlin Ecosystem Team | Google, Inc. |
[email protected]
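For reference, the proposed array-initializer feature quoted above, `new ValClass[100] (int i -> something())`, has a rough analogue in today's `java.util.Arrays.setAll`: allocate, then populate every slot before the array is used. A sketch under that assumption, again using a hypothetical `Complex` record in place of a real value class:

```java
import java.util.Arrays;

public class ExplicitArrayInit {
    // Hypothetical value-class candidate, modeled as a plain record.
    record Complex(double re, double im) {}

    public static void main(String[] args) {
        // Today's nearest analogue of `new Complex[100] (int i -> something())`:
        // the element-producing lambda runs once per index.
        Complex[] cs = new Complex[100];
        Arrays.setAll(cs, i -> new Complex(i, 0));

        System.out.println(cs[99]);  // prints Complex[re=99.0, im=0.0]
    }
}
```

Unlike the proposed syntax, this is not atomic with allocation: the array transiently holds nulls between `new Complex[100]` and `setAll`, which is exactly the gap a language-level initializer would close.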
