>From: "Jack, Paul" <[EMAIL PROTECTED]>
>
> > Well, the existing "purge()" method iterated through the
> > whole Map checking
> > each reference so see if it had been cleared. The new
> > "purge()" method
> > simply calls "processQueue()". This does not iterate through
> > the whole Map,
> > but instead uses the ReferenceQueue to directly get the
> > references that have
> > been cleared and remove them from the Map. This should be
> > much quicker.
>
>I agree that a ReferenceQueue improves performance, but why
>deprecate the public purge() method?
Firstly, the Map's innards should not be exposed to the user. What is the
point of a Map backed by SoftReferences if its users have to know that, and
call purge() whenever memory runs low (and how would they even know when that
is)? The Map should handle low memory itself.
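To make the point concrete, here is a rough sketch of the shape I have in mind
(the names are illustrative only, not the actual code): every public operation
drains the ReferenceQueue itself, so the caller never needs to know that any
purging is going on.

import java.lang.ref.ReferenceQueue;
import java.lang.ref.SoftReference;
import java.util.HashMap;

// Illustrative sketch only, not the real class.
public class AutoPurgingSoftMap {
    private final HashMap map = new HashMap();        // key -> KeyedSoftReference(value)
    private final ReferenceQueue queue = new ReferenceQueue();

    public Object put(Object key, Object value) {
        processQueue();                                // purging happens here, invisibly
        KeyedSoftReference old = (KeyedSoftReference)
                map.put(key, new KeyedSoftReference(key, value, queue));
        return (old == null) ? null : old.get();
    }

    public Object get(Object key) {
        processQueue();
        KeyedSoftReference ref = (KeyedSoftReference) map.get(key);
        return (ref == null) ? null : ref.get();
    }

    // Only the references the GC has actually cleared are touched; no full iteration.
    private void processQueue() {
        KeyedSoftReference ref;
        while ((ref = (KeyedSoftReference) queue.poll()) != null) {
            map.remove(ref.key);
        }
    }

    // The reference carries its key so processQueue() knows which entry to drop.
    private static class KeyedSoftReference extends SoftReference {
        final Object key;
        KeyedSoftReference(Object key, Object value, ReferenceQueue q) {
            super(value, q);
            this.key = key;
        }
    }
}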
> > Backwardly compatible? The class is broken as it is! People whose code
> > extends the current version will find it doesn't work anyway...
>
>The entrySet() and values() methods are broken, and should
>be patched; the createReference() method works fine,
>and there may exist perfectly valid code that requires it.
With all due respect, you are wrong. The whole Map is broken, because keys can
only be purged "manually" by the user calling the purge() method. The user
should not be burdened with doing this. The only way round it is to call
purge() EVERY time an Object is added to the Map, which I am sure you will
agree is a hopelessly non-performant way of doing things, given that purge()
has to iterate over every entry.
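For reference, this is roughly what that manual purge has to do, as I
understand the existing implementation (my paraphrase, not the actual source):

import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

// Paraphrase of the existing approach: every call walks every entry and tests
// whether its reference has been cleared.
public class ManualPurgeSketch {
    private final HashMap map = new HashMap();   // key -> SoftReference(value)

    public void purge() {
        Iterator it = map.entrySet().iterator();
        while (it.hasNext()) {
            Map.Entry entry = (Map.Entry) it.next();
            SoftReference ref = (SoftReference) entry.getValue();
            if (ref.get() == null) {             // cleared by the GC
                it.remove();
            }
        }
        // O(n) every time - calling this on every put() is hopeless.
    }
}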
> > The functionality is questionable not only because
> > composition should be
> > favoured over inheritance; it seems madness that I can extend
> > a class called
> > SoftRefHashMap to NOT be a SoftRefHashMap.
>
>"Madness?" That's a bit extreme. I agree the name is misleading
>but that could be easily addressed in the documentation, without
>breaking anybody's code.
It seems odd that I could come across a line of code that said
SoftRefHashMap m = someAPI.getMeASoftMap();
yet find that the Object returned does not use SoftRefs at all. Remember that
the behaviour of (for example) PhantomRefs is starkly different, so users may
be totally unable to understand the behaviour of their own code! PhantomRef
does not extend SoftRef, so why should PhantomRefHashMap extend
SoftRefHashMap?
>
>And there are also ways of migrating to a composition pattern
>that wouldn't require breaking compatibility.
>
> > The extra "subclass-able" functionality CANNOT exist in my
> > implementation
>
>Couldn't you use a specialized subclass of ReferenceQueue?
I have spent a bit of time on this, and it simply isn't feasible. The only
sensible way to design this class is to use a Reference type that holds the key
object, so that a cleared reference pulled off the queue can be mapped straight
back to its entry - hence I cannot rely on the "createReference()" method to
create a Reference with the required functionality.
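To illustrate (again with made-up names): if the Map had to accept whatever
Reference a subclassed createReference() handed back, a plain SoftReference
carries no key, so once it has been cleared and queued there is no way to tell
which entry it belonged to without iterating the whole Map again.

import java.lang.ref.Reference;
import java.lang.ref.ReferenceQueue;
import java.lang.ref.SoftReference;

// Made-up demo, not Commons code.
public class PlainReferenceProblem {
    public static void main(String[] args) {
        ReferenceQueue queue = new ReferenceQueue();
        Object value = new Object();

        // What a user-supplied createReference() might return:
        SoftReference ref = new SoftReference(value, queue);

        ref.enqueue();                           // pretend the GC cleared and queued it

        Reference cleared = queue.poll();
        // 'cleared' is the same reference, but it holds no key field, so the
        // only way to remove the dead entry from the Map would be another full
        // iteration - exactly what the ReferenceQueue was meant to avoid.
        System.out.println("queued reference recovered: " + (cleared == ref));
    }
}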
>Again, only the values() and entrySet() methods are broken. By
>all means, we should patch them. But existing functionality that
>works, and that people might rely on, should remain.
No - the Map does not shrink of its own accord when the JVM is running low on
memory; it requires a "purge()" method to be called. The implementation of
"purge()" is terrible because it has to iterate through the whole Map, and for
the Map to behave correctly that purging would need to happen whenever an
Object is put in the Map.
Without my fix (i.e. dropping the subclass-able "createReference()" method),
this Map is practically useless. If the "done thing" here is never to break
backwards compatibility, then fine: have someone put my class in as a separate
class. But don't leave a daft implementation of a class in Commons just because
it wasn't designed properly in the first place!