On 10/25/06, Stephen Colebourne <[EMAIL PROTECTED]> wrote:
> From: Niall Pemberton <[EMAIL PROTECTED]>
> > From a simplistic user perspective it seems to me that the compatibility
> > achieved by the JDK is successful. I don't understand the intricacies
> > required to generify a library, but if we could do the same wouldn't
> > this be the best solution from a user perspective?
> >
> > Perhaps you could expand on what the issues are with the JDK approach
> > and why they're not desirable in Commons Collections - when I look at
> > the differences for example between the generified and non-generified
> > versions of java.util.List, I don't see the mess you describe.

> The problem is erasure. The JDK wipes all knowledge of the type that you
> connect to the collection. Thus
>
> List<String> list = new ArrayList();
> if (list instanceof List<Integer>) {
> }
>
> fails to compile as the String type is erased.
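
Right - if I follow, every parameterization shares a single class at runtime,
so there is nothing left for instanceof to test against. A quick illustration
of my own (not from your mail, just how I understand it):

    import java.util.ArrayList;
    import java.util.List;

    public class ErasureDemo {
        public static void main(String[] args) {
            List<String> strings = new ArrayList<String>();
            List<Integer> integers = new ArrayList<Integer>();

            // Erasure leaves one runtime class for every parameterization,
            // which is why instanceof cannot check the element type.
            System.out.println(strings.getClass() == integers.getClass()); // true
            System.out.println(strings instanceof List);                   // true
        }
    }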

> In order to preserve backwards compatibility, Sun had to go to extreme
> lengths to ensure that the erased type of each generic element corresponds
> to the previous type of the method. Consider the max method on Collections:
>
> public static <T extends Object & Comparable<? super T>> T max(Collection<? extends T> coll)

I agree that this is a mess from a library developer's POV - but from a
user perspective isn't it pretty neat that it works for both old
ungenerified code and new generified code?

> Where, because Object comes before Comparable in the list of bound types, it's
> Object that the method signature gets erased to. Now that's pretty nasty.
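
(For what it's worth, that erasure to Object is visible with reflection -
something along these lines, assuming Java 5, is how I understand it:)

    import java.lang.reflect.Method;
    import java.util.Collection;
    import java.util.Collections;

    public class ErasedSignature {
        public static void main(String[] args) throws Exception {
            Method max = Collections.class.getMethod("max", Collection.class);
            // Object is the leftmost bound in
            // <T extends Object & Comparable<? super T>>, so T erases to
            // Object and the erased return type matches the pre-1.5 signature.
            System.out.println(max.getReturnType()); // class java.lang.Object
        }
    }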

Just to clarify, are you calling the method declaration nasty, or the fact
that it gets erased to Object nasty? To be honest, I still don't fully get
why Sun did it that way, even after reading the tutorial [1], which has it
as a specific example. I guess I'm probably showing my generics naivety,
but the following would seem simpler, with a check/cast to Comparable
inside the method:

   public static <T> T max(Collection<T> coll)
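
Roughly this shape is what I have in mind - just a sketch, and the cast is
obviously unchecked, so a non-Comparable element would only fail at runtime
rather than at compile time:

    import java.util.Collection;
    import java.util.Iterator;

    public class SimpleMax {
        // Sketch only: move the Comparable requirement from the signature
        // into the body. The unchecked cast trades away the compile-time
        // safety that the JDK's bounded signature provides.
        @SuppressWarnings("unchecked")
        public static <T> T max(Collection<T> coll) {
            Iterator<T> it = coll.iterator();
            T candidate = it.next();
            while (it.hasNext()) {
                T next = it.next();
                if (((Comparable<T>) next).compareTo(candidate) > 0) {
                    candidate = next;
                }
            }
            return candidate;
        }
    }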

Anyway, apologies for digressing - I realize this isn't generics 101.

> For [collections], there are cases worse than this, such as MultiMap, where the
> sf projects have demonstrated that it is not possible to generify the class
> without breaking backwards compatibility, typically because our interface isn't
> a true implementation of the original interface.
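
Just so I'm following the MultiMap case - my guess at the core conflict (a
hypothetical sketch, not anything taken from [collections] itself) is that
the contract has get(key) return a Collection of the values, yet the
interface extends Map, whose generified get must return V:

    import java.util.Collection;
    import java.util.Map;

    // Hypothetical generification - this does not compile, because
    // Collection<V> is not compatible with the V get(Object) inherited
    // from Map<K, V>.
    public interface GenerifiedMultiMap<K, V> extends Map<K, V> {
        Collection<V> get(Object key);
    }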

So perhaps MultiMap should remain unchanged (deprecated?) and a new
MultiMap2 could provide a proper generified implementation. Surely this
would be much less painful for the few such cases than changing the name
of everything.

> Beyond this, there are some classes (like TypedList) that we don't even want to
> port, as they'd be pointless. Add to that my desire to create a smaller jar file
> (time permitting), and there is ample reason not to worry excessively about
> backwards compatibility. We should target being about 90% backwards compatible,
> with the rest being fixes for API flaws and issues we have now.

Don't you mean 0% backwards compatibility - which is what changing the
package name results in?

> A simple port may appeal conceptually, but it's not really viable at all.

I would disagree - if it really is the case that 90% could be ported
with backwards compatibility, then it would seem viable IMO.

Having said that, it's not me that's going to be doing the work, but it
does seem valuable to discuss port vs. refactor rather than refactor
being a de facto decision and just having an argument on package
names.

Niall

> Stephen

