> Firstly, the Map's innards should not be exposed to the user; what is
> the point of having a Map which uses a SoftReference implementation if
> users of the Map have to know this (and thus call the purge() method
> when memory is low (how will they know?))? The Map should handle the
> situation that memory is low itself.
We can patch it to call purge() in its mutator methods, similar to
java.util.WeakHashMap. (A rough sketch of what I mean is at the end of
this message.)

> With all due respect, you are wrong. The whole Map is broken because
> keys can only be purged "manually" by the user calling the purge()
> method. There should be no constraint on the user to do this. The
> only fix for this is to call the "purge()" method EVERY time an
> Object is added to the Map, which I am sure you will agree is a
> hopelessly non-performant way of doing things.

Or someone can invoke purge() on a regular basis, using a Timer, or
after major alterations to the Map. (There's a sketch of the Timer
approach below as well.)

> It seems odd that I could come across a line of code that said
>
>     SoftRefHashMap m = someAPI.getMeASoftMap();
>
> but have the Object returned not use SoftRefs at all. Remember that
> the behaviour of (for example) PhantomRefs is starkly different.
> Users therefore may be totally unable to understand the behaviour of
> their code! PhantomRef does not extend SoftRef, so why would
> PhantomRefHashMap extend SoftRefHashMap?

I really do agree: the class name is misleading. Although, now that
you mention it, the current implementation won't work for phantom
references at all, since the constructor for PhantomReference requires
a ReferenceQueue.

> > Again, only the values() and entrySet() methods are broken. By all
> > means, we should patch them. But existing functionality that
> > works, and that people might rely on, should remain.
>
> No - the Map does not properly reduce in size when the JVM is
> running low on memory but instead requires a "purge()" method to be
> called. The implementation of the "purge()" method is terrible
> because it has to iterate through the whole Map. This needs to be
> done whenever an Object is put in the Map.
>
> This Map without my fix (i.e. to negate the possibility of having a
> "createReference()" method to be subclassed) is practically useless.
> If it is the "done thing" here not to break backwards compatibility
> then fine --> have someone put my class in as a separate class. But
> don't leave a daft implementation of a class in Commons because
> someone didn't design it properly in the first place!

Well, the collections package was created by harvesting existing
Jakarta code, so someone, somewhere is using this class successfully.
I agree it has bugs. We can fix the bugs without screwing our users.

Here's a two-step migration path that addresses all of your issues
(except the misleading name, which I just don't see as a killer
issue):

Step 1. For commons 3.0, patch SoftRefHashMap as follows:

1. Fix these methods to conform to the Map contract: equals(),
   hashCode(), containsValue(), values(), entrySet().

2. Deprecate the createReference() method. Provide a new constructor
   that lets people specify the type of reference to use (probably
   via static constants); encourage users to use this approach
   instead of the deprecated method.

3. Alter remove(), put() etc. to invoke purge() (see the first sketch
   below).

Step 2. For commons 4.0, patch SoftRefHashMap as follows:

1. Eliminate createReference().

2. Rewrite purge() to use a ReferenceQueue.

If we want to deprecate the entire class, that's fine, but its
replacement should probably go through a design process, to see if it
can provide additional features. For instance, why not provide a class
that allows for hard, soft, weak or phantom references for both its
keys and its values? So that, for instance, keys could be phantom
references but values could be soft references; mappings would be
removed if the key was garbage collected OR if memory was low. (The
last sketch below shows how such a class might wrap its referents.)
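To make the mutator-method and ReferenceQueue ideas concrete (Step
1(3) and Step 2(2) together), here's a rough, untested sketch. The
class and member names are mine, not the current source:

    import java.lang.ref.ReferenceQueue;
    import java.lang.ref.SoftReference;
    import java.util.HashMap;
    import java.util.Map;

    // Sketch only: values are held via SoftReferences registered with
    // a ReferenceQueue, so purge() touches only the entries the
    // garbage collector has actually cleared.
    public class SoftValueMap {

        // Reference subclass that remembers its key, so a cleared
        // value can be mapped back to the entry to remove.
        private static class SoftValue extends SoftReference {
            final Object key;
            SoftValue(Object key, Object value, ReferenceQueue queue) {
                super(value, queue);
                this.key = key;
            }
        }

        private final Map hash = new HashMap();   // key -> SoftValue
        private final ReferenceQueue queue = new ReferenceQueue();

        // The Step 2 rewrite: drain the queue rather than iterating
        // the whole Map.
        public void purge() {
            SoftValue sv;
            while ((sv = (SoftValue) queue.poll()) != null) {
                hash.remove(sv.key);
            }
        }

        // Mutators purge first, the way java.util.WeakHashMap
        // expunges stale entries inside its public methods.
        public Object put(Object key, Object value) {
            purge();
            SoftValue old =
                (SoftValue) hash.put(key, new SoftValue(key, value, queue));
            return (old == null) ? null : old.get();
        }

        public Object get(Object key) {
            purge();
            SoftValue sv = (SoftValue) hash.get(key);
            return (sv == null) ? null : sv.get();
        }

        public Object remove(Object key) {
            purge();
            SoftValue sv = (SoftValue) hash.remove(key);
            return (sv == null) ? null : sv.get();
        }
    }

Note that draining the queue makes purge() cost proportional to the
number of cleared entries, not to the size of the Map, which answers
the "has to iterate through the whole Map" objection.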
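The Timer approach is just as small. The 30-second interval is
arbitrary, and this assumes the class's existing purge() method and
that SoftRefHashMap is imported from the collections package:

    import java.util.Timer;
    import java.util.TimerTask;

    // Sketch: schedule map.purge() periodically on a daemon thread,
    // so the timer never blocks JVM shutdown.
    public class PurgeScheduler {
        public static void schedulePurge(final SoftRefHashMap map) {
            Timer timer = new Timer(true);   // daemon thread
            timer.schedule(new TimerTask() {
                public void run() {
                    map.purge();
                }
            }, 30000, 30000);                // every 30 seconds
        }
    }

(java.util.Timer requires JDK 1.3 or later, of course.)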
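And for the static constants in Step 1(2), or a replacement class that
mixes reference types for keys and values, the core is a single
factory method. The names below are made up, but the constructors are
straight out of java.lang.ref, and they show why PhantomReference
forces purge() to be queue-based:

    import java.lang.ref.PhantomReference;
    import java.lang.ref.Reference;
    import java.lang.ref.ReferenceQueue;
    import java.lang.ref.SoftReference;
    import java.lang.ref.WeakReference;

    // Sketch: constants the proposed constructor could accept, plus
    // a factory that wraps a referent accordingly.
    public class ReferenceType {
        public static final int HARD = 0;
        public static final int SOFT = 1;
        public static final int WEAK = 2;
        public static final int PHANTOM = 3;

        public static Reference wrap(int type, Object referent,
                                     ReferenceQueue queue) {
            switch (type) {
            case SOFT:
                return new SoftReference(referent, queue);
            case WEAK:
                return new WeakReference(referent, queue);
            case PHANTOM:
                // PhantomReference's only constructor takes a
                // ReferenceQueue; there is no way to use it without
                // a queue-based purge().
                return new PhantomReference(referent, queue);
            default:
                // HARD referents are stored directly by the map and
                // never reach this factory.
                throw new IllegalArgumentException("bad type: " + type);
            }
        }
    }

A replacement map could then take two such constants, one for keys and
one for values: PHANTOM keys with SOFT values would give the "removed
if the key was collected OR memory was low" behaviour described above.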
-Paul
