On 11/12/2018 14:02, Brian Goetz wrote:
Received on the -comments list.


-------- Forwarded Message --------
Subject:        Re: enhanced enums - back from the dead?
Date:   Tue, 11 Dec 2018 11:13:52 +0000
From:   elias vasylenko <eliasvasyle...@gmail.com>
To:     amber-spec-comme...@openjdk.java.net



So going back a little, the suggestion of Rémi is to have the direct
supertype of Foo be Enum<Foo<?>> instead of the proposed Enum<Foo>?

And the problem with this is that when invoking e.g. noneOf or allOf with a
class literal we have an assignment of Class<Foo> to Class<E>, which can't
satisfy the bound E <: Enum<E>?
Yes, that is the issue
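
(For concreteness, here is a minimal compilable illustration of the bound in question, using an ordinary non-generic enum; the generic enum Foo appears only in comments, since it is hypothetical.)

    import java.util.EnumSet;

    enum Plain { A, B }

    class BoundDemo {
        public static void main(String[] args) {
            // EnumSet.noneOf is declared as <E extends Enum<E>> EnumSet<E> noneOf(Class<E>),
            // and Plain.class has type Class<Plain>, so E := Plain satisfies E <: Enum<E>.
            EnumSet<Plain> empty = EnumSet.noneOf(Plain.class);
            System.out.println(empty);

            // Under the proposal, a generic enum Foo<T> would have direct supertype
            // Enum<Foo<?>>; Foo.class would have type Class<Foo> (raw), forcing E := Foo,
            // which cannot satisfy E <: Enum<E> (only E := Foo<?> could) - the issue above.
        }
    }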

I hope I have understood so far. Then could this not be addressed by also
adjusting the signature of EnumSet, and some of the methods on it (those
which mention Class<E>), such that the type parameters are specified as <E
extends Enum<? extends E>>? Then I think the bound is satisfied as follows:
Changing the signature is one way; others have suggested changing the typing of class literals to also use wildcards. I think either way we're in a world of migration pain.

Foo <: [E:=Foo]Enum<? extends E>
Foo <: Enum<? extends Foo>
Enum<Foo<?>> <: Enum<? extends Foo>
Foo<?> <: Foo

And the only things that can satisfy the bound on E would be E:=Foo<?>,
E:=Foo, or a capture of E, or the infinite type Enum<? extends Enum<?
extends ...>>
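
(A small compilable sketch of the relaxed bound; noneOfRelaxed is a made-up stand-in, not a real EnumSet method. It only shows that the relaxed bound is expressible and still accepted by ordinary enums; the generic-enum case is the derivation above.)

    class RelaxedBoundSketch {
        enum Plain { A, B }

        // Hypothetical relaxed signature: E extends Enum<? extends E> instead of E extends Enum<E>.
        static <E extends Enum<? extends E>> void noneOfRelaxed(Class<E> elementType) {
            // body irrelevant here; only the signature matters for the typing argument
        }

        public static void main(String[] args) {
            // Ordinary enums still satisfy the relaxed bound:
            // Plain <: Enum<Plain> <: Enum<? extends Plain>.
            noneOfRelaxed(Plain.class);
        }
    }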

This does seem to create some other problems.

One problem (or rather, an avoidable pitfall) is that the aforementioned,
now-legal infinite type would describe an enum set which accepts enum values
belonging to different classes, which means losing static type safety. But
since the infinite type is not denotable, we just have to make sure it can
never be inferred anywhere, meaning that the signatures of e.g. the EnumSet.of
methods should retain their existing type parameter bound of <E extends
Enum<E>>, such that we can only infer E:=Foo<?>.
Javac doesn't even support full blown infinite types, but truncates them at the second level of nesting, which will probably cause issues here.
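
(To make the type-safety stake concrete, this is what the existing strict bound on EnumSet.of buys today; the rejected call is left commented out.)

    import java.util.EnumSet;

    class Heterogeneous {
        enum Color { RED, GREEN }
        enum Size  { SMALL, LARGE }

        public static void main(String[] args) {
            // Fine: both elements have the same enum type, so E := Color.
            System.out.println(EnumSet.of(Color.RED, Color.GREEN));

            // Rejected today: no E satisfies E <: Enum<E> for a mix of enum classes.
            // EnumSet.of(Color.RED, Size.LARGE);
            // If inference could ever produce the infinite type
            // Enum<? extends Enum<? extends ...>>, a call like this could slip through.
        }
    }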

Another problem is that in any existing class which extends EnumSet some of
the overriding method signatures may be made incompatible. I expect this
would require refining the notion of override equivalence of signatures in
the JLS to a notion of override compatibility, where a little flexibility
is allowed in overriding methods to have more specific bounds on type
parameters (so long as the erased signature is unchanged of course).
I don't know if this is feasible, but I think there's an argument that it's
a sensible refinement regardless of the enhanced enums issue. It would be
nice to be able to adjust the bounds on the type parameters of a method to
be less specific without worrying about breaking source compatibility.
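
(An illustration of the overriding problem, with purely illustrative class names rather than the real EnumSet hierarchy: if a supertype's bound is relaxed while a subclass keeps the old, stricter bound, the two signatures are no longer override-equivalent, yet they erase to the same thing. The offending override is commented out so the sketch compiles.)

    class RelaxedBase {
        // Signature after the hypothetical relaxation:
        <E extends Enum<? extends E>> void accept(Class<E> elementType) { }
    }

    class StrictSubclass extends RelaxedBase {
        // A pre-existing override written against the old, stricter bound.
        // Uncommenting it yields an error along the lines of:
        //   name clash: <E>accept(Class<E>) in StrictSubclass and <E>accept(Class<E>)
        //   in RelaxedBase have the same erasure, yet neither overrides the other
        // because differing type-parameter bounds make the signatures non-override-equivalent.
        //
        // <E extends Enum<E>> void accept(Class<E> elementType) { }
    }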

Changing signatures (the bounds of the Enum class, the bounds of the enum-accepting generic methods) is something that was considered a year ago - but we are afraid of the source compatibility impact this would cause. As you mention, it would break at the very least overriding/hiding, and there are other, more subtle issues (reflection will give you different generic types if you ask for bounds, etc.).
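
(The reflection point is easy to observe; a quick sketch that prints the declared bound of E on EnumSet.noneOf. Today it prints java.lang.Enum<E>; a relaxed signature would report a different bound to any client code that inspects it.)

    import java.lang.reflect.Method;
    import java.lang.reflect.TypeVariable;
    import java.util.Arrays;
    import java.util.EnumSet;

    class InspectBound {
        public static void main(String[] args) throws NoSuchMethodException {
            Method noneOf = EnumSet.class.getMethod("noneOf", Class.class);
            TypeVariable<Method> e = noneOf.getTypeParameters()[0];
            // The declared bound of E, currently java.lang.Enum<E>.
            System.out.println(Arrays.toString(e.getBounds()));
        }
    }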

Also, I believe that _any_ code out there manipulating enums has effectively copied the EnumSet/EnumMap pattern, with strict f-bounds; so while we could fix our JDK types, we could not fix all the other code with similar signatures. That is what has sent us in a different direction.

Thanks
Maurizio


I'm sure there's a lot that I've overlooked; this is quite difficult to
reason about in the abstract.

Eli

On Mon, 10 Dec 2018 at 15:38, Maurizio Cimadamore <
maurizio.cimadam...@oracle.com> wrote:

On 08/12/2018 12:45, fo...@univ-mlv.fr wrote:
----- Original Message -----
From: "Maurizio Cimadamore" <maurizio.cimadam...@oracle.com>
To: "Remi Forax" <fo...@univ-mlv.fr>
Cc: "amber-spec-experts" <amber-spec-experts@openjdk.java.net>
Sent: Saturday, 8 December 2018 00:57:58
Subject: Re: enhanced enums - back from the dead?
[...]

It's not that I don't like the feature, it's that for me it's a feature you
cannot even put in the box of features that we could do. We start with "hey,
we could do this!" but there are some typing issues. Now, what you are saying
is that we can use raw types to avoid the typing issues, but as I said above,
you are trading an error for a bunch of warnings, which doesn't seem to be a
good deal*.
I agree that having too many warnings is bad - in my experiment,
although I touched a lot of code, including stream chains, I did not
run into them; Comparator.comparing is probably one of the worst beasts (and
doesn't work well with target typing even leaving generic enums aside). Not
sure if that shifts the balance one way or another, but point taken.
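
(As a side note on Comparator.comparing, this is the kind of target-typing limitation it already suffers from today, independently of generic enums; the failing variant is commented out.)

    import java.util.Comparator;

    class ComparingDemo {
        public static void main(String[] args) {
            // Does not compile: the chained .reversed() call means the target type never
            // reaches comparing(), so the lambda parameter cannot be typed:
            // Comparator<String> bad = Comparator.comparing(s -> s.length()).reversed();

            // Works once the lambda parameter is given explicitly:
            Comparator<String> ok = Comparator.comparing((String s) -> s.length()).reversed();
            System.out.println(ok.compare("a", "bb"));
        }
    }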

On this topic, since I was there, I tried to tweak the prototype so that
the enum's values() and valueOf() methods return the wildcard type Foo<?>,
while the supertype is Enum<Foo>, and this seems to work surprisingly well,
both in the tests I had and in the new one you suggest. Maybe that would
minimize the raw type usage, pushing it quite far behind the curtains, and
keeping it strictly as a migration aid for APIs such as EnumSet/EnumMap?
Using Enum<Foo<?>> should also work, no?

No, we have tried that path and it doesn't work - ultimately you get an
issue because the Class-accepting EnumSet factories (noneOf, allOf) take a
Class<E>, and, in the case of a class literal, you get back a Class<Foo>
(Foo raw). So if the supertype is Enum<Foo<?>>, you get two incompatible
constraints on E, namely Foo and Foo<?>.
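
(A compilable analogue of this failure mode, using an ordinary f-bounded class as a stand-in for Enum; all names here are illustrative, and the rejected call is commented out.)

    // Stand-in for Enum, with the same f-bound shape.
    abstract class Pseudo<E extends Pseudo<E>> { }

    // Stand-in for a generic enum whose direct supertype uses a wildcard, as in Enum<Foo<?>>.
    class Foo<T> extends Pseudo<Foo<?>> { }

    class LiteralDemo {
        // Stand-in for the Class-accepting EnumSet factories, with the existing strict bound.
        static <E extends Pseudo<E>> void noneOfLike(Class<E> elementType) { }

        public static void main(String[] args) {
            // A class literal never carries type arguments: the type of Foo.class is
            // Class<Foo>, with Foo raw.
            Class<?> literal = Foo.class;
            System.out.println(literal);

            // The literal forces E := Foo (raw), but with supertype Pseudo<Foo<?>> only
            // something like E := Foo<?> could satisfy E <: Pseudo<E>; the constraints
            // clash, so the following is rejected:
            // noneOfLike(Foo.class);
        }
    }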

Maurizio


Rémi
