        What these questions are asking is: can I safely narrow these
        longs to int, just like the above.  In the first case, I can
        -- just like with String and Object.  In the second, I can't
        -- just like with Integer and Object.  Do we agree these are
        the same?


Remi: Yes, Socrates, I believe they are.

No, they are not.
When you widen an Integer to an Object, it stays an Integer at runtime; when you widen an int to a long, you transform it into a long, and the original int is lost.

Lost?  Numbers can be lost?  OMG, that's terrible!  Can you imagine if we lost a _really_ important number, like one or zero?  Society would collapse.  Something must be done!

Oh, wait, that's not how it works.  Numbers, being platonic ideals, just ARE.  I can summon zero into being at will:

    int a = 0;

Can I summon multiple zeroes?  Let me try:

    int a = 0;
    int b = 0;

No, I can't; they are the same zero.  I cannot discern any difference between the zero in a and the zero in b (this is the "substitutability" criterion on which Valhalla equality is based.)
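To make that concrete, here is a tiny illustrative check on the a and b above (assert requires running with -ea):

    assert a == b;                 // true: they are the same zero
    assert (long) a == (long) b;   // widened to long, still indistinguishable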

If I put zero in an int, and then move it to a long, and back, can I tell the difference?

    int a = 0;
    long b = a;
    int c = (int) b;

No, I cannot; the zero in a and the zero in c are the same value. And -- crucially -- the zero in *b* is not a "different zero".  This concept has been embedded in the JLS forever.  JLS 4.2 defines byte, short, int, and long as binary encodings of _integers_.  JLS 5.2 allows us to assign

    byte b = 0;

even though the type of the RHS is int, not byte, because there is a narrowing conversion from int to byte (and similar) for a compile-time constant when _the value of the constant expression is representable in the type of the variable_.  The JLS recognizes that "byte 0" and "int 0" are different encodings of _the same value_. The zero is not "lost" when it is converted to a long, or a byte. It is still zero.
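To make the representability condition concrete, here is an illustrative sketch of what JLS 5.2 does and does not allow:

    byte ok = 100;      // allowed: 100 is representable in byte (-128..127)
    // byte bad = 200;  // rejected: 200 is not representable in byte
    long wide = 0;      // the int constant 0 widens to long; it is still zero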

When you widen an int to a long, you transform it into a long, and the original int is lost.

This interpretation feels a lot like that of the people who say "but instanceof is a type test, full stop."  (It is understandable, because they have only experienced it as a type test in their lifetime, but just because no one has ever built a house on the empty land across from you doesn't mean that no one ever can.)  This is a "who moved my cheese" reaction.  Now, as a French person, I realize you take cheese very seriously.  But the cheese isn't lost, it is just moved.  Go find it and enjoy it, my little mouse.

Did we *have* to do this right now?  No, we didn't; we could have continued to allow pattern matching on primitives to be invariant, at least for a while.  But that would be increasingly problematic as Valhalla adds new numeric types; we'd be able to convert them via assignment, but not ask if they are the same.  Similarly, it would make constant patterns more difficult.
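To illustrate what "ask if they are the same" means with today's types -- a sketch, assuming the primitive type patterns described by this JEP are enabled -- a pattern lets us ask whether a value round-trips exactly:

    long l = 42L;
    // Assignment can convert l to an int, but only a pattern lets us *ask*
    // whether the long holds a value that is also an int:
    if (l instanceof int i) {
        IO.println("the long holds the same value as the int " + i);  // taken: 42 fits
    }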

Your puzzlers -- like the objection about lossy conversions -- are not about pattern matching; they merely show that pattern matching reveals existing decisions about the language that we might not, in hindsight, like.

Here are examples (puzzlers) exposing the rift.

  char letter = 'A';
  switch(letter) {
    case short s -> IO.println("short");
    case char c -> IO.println("char");
  }

Here you have a character, typed as a char, but it prints "short".

Your objection here is that `char` is not really a character type; it is a 16-bit unsigned integral type, with support for literals that look like characters.  But Java will freely convert between characters and other integral types; this is your objection, and pattern matching merely makes it more obvious.  Same with your other example.
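For instance, even without pattern matching, Java already treats char values as just another integral encoding (illustrative):

    char letter = 'A';
    int code = letter;         // widening: 'A' is the integer 65
    short s = (short) letter;  // narrowing cast between integral types
    char back = 65;            // constant narrowing: 65 is representable as a char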

Moreover, with Valhalla, we want to try to make primitives look more like value classes; we talk, for example, about allowing ArrayList<int>. But this goal is in conflict with the semantics of the primitive type pattern JEP, because the JEP semantics are based on a world where primitives and objects do not talk to each other.

A better way to think about this is that this JEP is preparing the way for pattern matching on values.  When Valhalla is complete, the world of references and primitives will give way to one of identity objects and value objects.  Identity objects get their polymorphism through inheritance (which is transitive); value objects get their polymorphism through point-to-point conversions.  This JEP takes the first step by providing a unified framework for type patterns, based on the set of conversions permitted in cast context (the most permissive set of conversions), and the notion of _exact conversion_ (whether or not the conversion was lossy).  This sets the stage for pattern matching in Valhalla.
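As a sketch of what exactness buys (again assuming the JEP's semantics), the same primitive type pattern succeeds or fails depending on whether the conversion would lose information:

    int small = 42;
    int big = 1000;
    IO.println(small instanceof byte b1);  // true:  42 converts to byte exactly
    IO.println(big instanceof byte b2);    // false: 1000 would be lossy, so no match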

Here is another example:

  record Box<T>(T value) {}
  int value = 42;
  switch(new Box<>(value)) {
    case byte b -> IO.println(b);
    case int i -> IO.println(i);
  }

This one does not compile because the int value is automatically boxed to an Integer, which follows different rules than the primitive rules (per this JEP).

I'm not sure what your objection is, but like the other examples, I am sure it has nothing to do with pattern matching.  The semantics of switch are defined relative to the selector expression, the value and type of which have nothing to do with the patterns in the body of the switch.  `new Box<>(value)` is an expression with its own type; we can then match on it with patterns that are applicable to that type.  Whether or not we can express primitive type patterns doesn't affect the typing of the selector.
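For comparison -- a sketch, not a claim about what the original example intended -- here is one way to write patterns that are applicable to the selector's type, Box<Integer>:

    record Box<T>(T value) {}

    int value = 42;
    switch (new Box<>(value)) {                // the selector has type Box<Integer>
        case Box(Integer i) -> IO.println(i);  // a record pattern applicable to Box<Integer>
        // (under this JEP, case Box(int i) would also be applicable, via unboxing)
    }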

Your claim that this "inhibits" the unification of objects and primitives is a very big one.  Go take as much time as you need to capture this argument clearly and persuasively.
