As you said, there is a problem with the lossy conversions, and I'm confused
about why you want to be able to invert them, given that you cannot.
You keep saying this, but it keeps not making sense. Can we try to find
exactly where you are getting uncomfortable (and NOT veer into unrelated
Valhalla questions -- pretend null just doesn't exist for now.)
Let's start with the easy ones:
Object p = "foo"; // widen String to Object
Object q = Integer.valueOf(3); // widen Integer to Object
...
if (p instanceof String s) { ... } // yes it is
if (q instanceof String s) { ... } // no it isn't
We can widen String and Integer to Object; we can safely narrow p back
to String, but we can't do so for q, because it is "outside the range"
of references to String (which embeds in "references to Object".)
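To spell out what "safely narrow" means operationally, the pattern is just
the test-and-cast idiom we already write by hand (an illustrative sketch,
not new semantics):

if (p instanceof String) {
    String s = (String) p;  // cannot throw: the test proved p is in range
}
String t = (String) q;      // blind narrowing: throws ClassCastException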
OK, now let's do int and long.
long small = 3;
long big = Long.MAX_VALUE;
if (small instanceof int a) { ... } // yes, it is
if (big instanceof int b) { ... } // no, it isn't
What these questions are asking is: can I safely narrow these longs to
int, just like the above. In the first case, I can -- just like with
String and Object. In the second, I can't -- just like with Integer and
Object.
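Operationally, the question the pattern asks is the range check you would
write by hand today (a sketch, not the spec's definition):

static boolean fitsInInt(long v) {
    // v is "in the range" of int iff narrowing and widening back is lossless
    return (long) (int) v == v;
}
// fitsInInt(3L)             -> true,  so small instanceof int a matches
// fitsInInt(Long.MAX_VALUE) -> false, so big instanceof int b does not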
OK, now let's do int and double.
double zero = 0;
double pi = 3.14d;
if (zero instanceof int i) { ... }
if (pi instanceof int i) { ... }
Same thing! The first exactly encodes a number that is representable in
int (i.e., it could have arisen from widening an int to double); the
second does not.
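The same round-trip sketch works here (illustrative only; the real
exactness rules live in the spec):

static boolean representableInInt(double v) {
    // true iff v could have arisen from widening some int to double;
    // the second clause rejects -0.0, which round-trips to +0.0 and so
    // compares equal but does not encode any int exactly
    return (double) (int) v == v
        && Double.doubleToRawLongBits(v) != Double.doubleToRawLongBits(-0.0);
}
// representableInInt(0.0)  -> true,  so zero instanceof int i matches
// representableInInt(3.14) -> false, so pi instanceof int i does not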
Before you dive into somewhere else, if you have a problem with any of
these, please try to state it clearly. Acceptable answers include:
- yes, this is all fine, my problem is somewhere else
- no, I have X problem with *exactly* these cases
If you prefer: float instanceof int makes little sense, given that the
widening conversion from int to float is lossy.
It's the same thing as with an unsafe cast: o instanceof List<String> makes
little sense, because converting a List<String> to Object is lossy.
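Concretely, here is the loss (2^24 + 1 has more significant bits than
float's 24-bit mantissa):

int n = 16_777_217;          // 2^24 + 1
float f = n;                 // implicit widening int -> float, rounds to 16_777_216
System.out.println((int) f); // prints 16777216: the round trip does not recover n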
- it does not work well with Valhalla, especially with phase 3
(i.e., "let's pretend that a primitive type is a value type")
As I said, I think this is a distraction, but if you disagree, you are
going to need to provide a much more exhaustive description of how you
think Valhalla will work and why this is a conflict, than just appealing
to claims like "won't work with Valhalla." Or, alternatively, if you
want to focus entirely on that, then that's fine, start a new thread
(but I would expect that mail to have more "how would X work" questions
rather than assertions about how it would work.)
With Valhalla, Integer is seen as int | null, with Integer! being an
equivalent of int.
Sadly, Integer! is not int, because Integer! is a subtype of Object while
int is not, but we can try to provide an integration that brushes that
aside as a detail.
So for pattern matching, it would be sad if int and Integer! behaved
differently, now that we know we want to try to retrofit primitive types
as value types.
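(Today this difference is visible as boxing: an int reaches Object only
through a conversion, never through subtyping:)

Object o = 3;       // compiles only because 3 is auto-boxed to an Integer
Integer boxed = 3;  // Integer sits in the Object hierarchy; int does not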
The problem with the semantics you propose is that the behavior of a
switch on an int and that of a switch on an Integer! are wildly different.
For example,
switch (v) {
    case byte b -> ...
    case int i -> ...
}
If v is an int, the switch compiles; if v is an Integer (or an Integer!,
once we get them), the switch does not compile.
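(A side-by-side sketch, assuming the JEP's applicability rules, which
follow casting conversions:)

int i = 3;
switch (i) {                  // compiles: both patterns apply to int
    case byte b -> System.out.println("fits in a byte: " + b);
    case int x  -> System.out.println("needs an int: " + x);
}

Integer boxed = 3;
switch (boxed) {              // does not compile: case byte b is not
    case byte b -> ...        // applicable to Integer, just as the cast
    case int x  -> ...        // (byte) boxed is not a legal conversion
}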
- your interpretation is surprising when I've tried to explain it to
other people (even people coming to JUGs).
I believe that, but as an educator, you should well know that often
"surprise" is not an indication of "wrong theory" but "wrong user mental
model", which implies that what is needed is more education. If we
consistently designed the language for least-surprise, we would very
quickly find ourselves in a pit where we cannot add anything, both
because something always surprises a 1995-era programmer, and because we
would have added layers and layers of ad-hoc "what we think the user
expects" features that we would eventually collapse under our own
weight. (The "implicit conversion from float to int" is an example of
this kind of mistake, and look where that got us.)
The problem here is the inverse: people welcome the change, the ability
to use primitive types as patterns; the problem is that the behavior is
different from what people expect.
So it may make sense to have patterns that behave the way the JEP
describes, but as pattern methods, not as type patterns.
regards,
Rémi