On 11/25/2015 12:42 PM, Andrew Haley wrote:
On 11/24/2015 05:47 PM, Roger Riggs wrote:
Memory is an increasingly critical resource; we should be giving
developers more tools to manage their use of memory. The Weak and
Soft reference forms of the cleaner make it easier to be aware of
and respond to increased memory pressure.
The management of memory is not hypothetical for most large
applications. It was one topic that came up in questions at J1.
Developers do quite a bit of work trying to figure out algorithms
for working caches, serializing to disk, and having the data
available when it is needed. There is a lot of guesswork about how
the GC is working and how it behaves at or near the limits.
SoftReferences are a bit of a blunt instrument, but they provide
evidence that an application can use to regulate its long-term memory
use.
The WeakReference forms, if provided, can be an alternative to the
idiosyncratic cleanup approaches used in various weak-keyed and
weak-valued collections. If there are other mechanisms contemplated
for more efficient memory management, then perhaps these are not
necessary; but if not, the current mechanisms should be easier to use.
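For illustration, here is a minimal sketch (the class and method names
are made up for this example) of the kind of hand-rolled ReferenceQueue
polling that weak-valued collections typically do today; a WeakReference
form of the cleaner could centralize exactly this plumbing:

    import java.lang.ref.ReferenceQueue;
    import java.lang.ref.WeakReference;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Illustrative weak-valued cache: entries disappear once their
    // values are no longer strongly reachable.
    class WeakValueCache<K, V> {

        // Weak reference that remembers which key it was stored under.
        private static final class Entry<K, V> extends WeakReference<V> {
            final K key;
            Entry(K key, V value, ReferenceQueue<V> q) {
                super(value, q);
                this.key = key;
            }
        }

        private final Map<K, Entry<K, V>> map = new ConcurrentHashMap<>();
        private final ReferenceQueue<V> queue = new ReferenceQueue<>();

        public void put(K key, V value) {
            expungeStaleEntries();
            map.put(key, new Entry<>(key, value, queue));
        }

        public V get(K key) {
            expungeStaleEntries();
            Entry<K, V> e = map.get(key);
            return e == null ? null : e.get();
        }

        // The idiosyncratic part: every such collection rolls its own
        // variant of this drain-the-queue step.
        @SuppressWarnings("unchecked")
        private void expungeStaleEntries() {
            Entry<K, V> e;
            while ((e = (Entry<K, V>) queue.poll()) != null) {
                map.remove(e.key, e);
            }
        }
    }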
By "memory" here, do you mean native memory for buffers, etc? I'm
guessing so. If so, I'm not sure that it makes sense to think of this
as a cure for flaky finalization. We've got a cure for early
finalization now with keepAlive() (or whatever it gets called), but
late (or never) finalization is, as far as I can see, unfixable.
IMVHO it makes more sense to encourage developers to get away from
lifecycle maintenance based on reachability.
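To make that concrete, here is a hedged sketch of the direction Andrew
is suggesting, written against a Cleaner API along the lines being
discussed in this thread (the NativeBlock/State names and the native
handle are invented for the example): the resource is released
deterministically by close(), and reachability-based cleanup is only a
safety net for callers who forget.

    import java.lang.ref.Cleaner;

    // Explicit lifecycle: close() frees the resource; the Cleaner is a
    // backstop, not the primary mechanism.
    final class NativeBlock implements AutoCloseable {

        private static final Cleaner CLEANER = Cleaner.create();

        // Cleanup state is a static class so it holds no reference back
        // to the NativeBlock (which would keep it reachable forever).
        private static final class State implements Runnable {
            private long address;         // hypothetical native handle
            State(long address) { this.address = address; }
            @Override public void run() {
                if (address != 0) {
                    // free(address);     // placeholder for the real release
                    address = 0;
                }
            }
        }

        private final Cleaner.Cleanable cleanable;

        NativeBlock(long address) {
            this.cleanable = CLEANER.register(this, new State(address));
        }

        @Override
        public void close() {
            cleanable.clean();  // deterministic release; runs State at most once
        }
    }

    // Usage: try (NativeBlock b = new NativeBlock(handle)) { ... }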
Unfortunately the current design for ByteBuffers does not allow
unmap(), so large mapped buffers may hang around for a long time.
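For concreteness, a minimal example of the situation (the file name is
arbitrary): there is simply no unmap() to call, and closing the channel
does not release the mapping.

    import java.io.IOException;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    public class MapDemo {
        public static void main(String[] args) throws IOException {
            Path path = Paths.get("data.bin");  // arbitrary example file
            MappedByteBuffer buf;
            try (FileChannel ch =
                     FileChannel.open(path, StandardOpenOption.READ)) {
                buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            }   // closing the channel does NOT release the mapping

            // ... use buf ...

            buf = null;  // the mapping is released only after the buffer
                         // becomes unreachable and the GC eventually
                         // notices; nothing here can force that.
        }
    }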
4724038 says
"We ... have given this problem a lot of thought ... We have yet to
come up with a way to implement an unmap() method that's safe,
efficient, and plausibly portable across operating systems. We've
explored several other alternatives aside from the two described
above, but all of them were even more problematic. We'd be thrilled
if someone could come up with a workable solution, so we'll leave
this bug open in the hope that it will attract attention from someone
more clever than we are."
I'm very tempted to take a bite at this, but the above text is rather
forbidding. I think I know how to do it. (Famous last words?)
Andrew.
I wish you luck, Andrew. But in case you run into any obstacles along
the way that you can't solve, I have an idea for an alternative
solution. As I understand it, the problem is that a ByteBuffer that
lives long enough gets moved to an old-generation region, which is
scanned very rarely, and so its life is prolonged more than necessary.
If that's the case, then what about the following:
- reserve a special value of object age in the object header to mean:
  this is a "Dorian Gray" object (or, if there is a spare bit in the
  object header, use that bit to mark it instead)
- make collectors treat this value specially (i.e. don't increment it
  and keep treating the object as young - never move such an object
  to the old generation)
- have an internal API to patch this value in a given object's header
  at construction time (and maybe reset it to the initial young value
  later if needed)
Direct ByteBuffers could use this internal API to declare themselves
"Dorian Grays" at construction. They would never be moved to the old
generation and so would be scanned frequently enough to be found
phantom-reachable in time.
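Purely hypothetically, the internal API could be as small as the sketch
below; nothing like this exists in the JDK today, and the names are
invented here only to illustrate the shape of the proposal.

    package jdk.internal.misc;      // hypothetical location

    // Hypothetical VM hook for the "Dorian Gray" scheme described above.
    public final class GenerationPinning {

        private GenerationPinning() {}

        // Mark obj so collectors stop incrementing its age and never
        // promote it; it keeps being scanned on every young collection.
        public static native void pinYoung(Object obj);

        // Restore normal aging if the object no longer needs pinning.
        public static native void unpinYoung(Object obj);
    }

    // A direct ByteBuffer constructor would then call
    //     GenerationPinning.pinYoung(this);
    // so the buffer stays in the young generation and its cleaner's
    // PhantomReference is discovered promptly.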
I think this could work, as there are not many instances of direct
buffers alive at any one time, so this would not affect the young
generation too much. I understand that such byte buffers would
frequently be referenced from the old generation, creating
old-to-young links that would have to be tracked with all the
associated overhead. But that might be a good trade-off anyway for
some applications.
What do you think?
Regards, Peter