"Bryan Sant" <[EMAIL PROTECTED]> writes:

> No doubt that these situations exist: cases where someone doesn't
> allow the GC enough memory to run efficiently.  However, needing
> double the heap size sounds like severely outdated information to me.
>
> This article is pretty old, but even it disagrees with the literature
> you mention.
> http://www.ibm.com/developerworks/library/j-jtp01274.html

I've read that page before, too.  It's pretty vague and hand-wavy, and
provides no real statistics, but it does give some good information
about how the JVM has improved.  Anyway, you made me go look up the
paper I had read before.  You can find a link to the pdf/ps file here:
http://citeseer.ist.psu.edu/hertz05quantifying.html

The paper was published in 2005, so it's fairly recent, and the
experiments were done in Java.  It is far less hand-wavy, contains
real experimental data, and shows that things are worse than I said
earlier.  With 3x the heap space, the garbage-collected version runs
17% slower than with explicit memory management.  With only 2x the
heap space, it runs 70% slower.  And once the heap starts interacting
with the OS's paging subsystem, you see order-of-magnitude performance
drops compared with explicit management.
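(For reference, heap headroom on the HotSpot JVM is set with the
standard -Xms/-Xmx flags; the class name below is just a placeholder,
not an actual program:)

```shell
# Undersized heaps force frequent collections; give the collector
# generous headroom up front.  MyApp is a placeholder class name.
java -Xms512m -Xmx1536m MyApp
```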

I'm a big fan of garbage collection, but it does carry a significant
cost in the space efficiency of programs.  I'd like to see further
research into other schemes, like region-based memory management, that
reduce the space cost where it matters, such as in embedded systems.
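For what it's worth, the core idea behind regions can be sketched in a
few lines.  This is my own toy illustration (all names are made up,
none of it is from the paper): carve allocations out of one big
buffer with a bump pointer, then free the whole region at once by
resetting that pointer.

```java
// Toy region (arena) allocator: bump-pointer allocation from a
// fixed pool, with bulk deallocation via reset().  Offsets stand in
// for pointers since Java has no raw pointers.
public class Arena {
    private final byte[] pool;
    private int offset = 0;

    public Arena(int size) { pool = new byte[size]; }

    // Bump-pointer allocation: O(1), no per-object header, no GC work.
    public int alloc(int nbytes) {
        if (offset + nbytes > pool.length)
            throw new OutOfMemoryError("arena exhausted");
        int start = offset;
        offset += nbytes;
        return start;
    }

    // Freeing the entire region is a single pointer reset.
    public void reset() { offset = 0; }

    public static void main(String[] args) {
        Arena a = new Arena(1024);
        int p1 = a.alloc(100);       // offset 0
        int p2 = a.alloc(200);       // offset 100
        System.out.println(p1 + " " + p2);
        a.reset();                   // "free" everything at once
        System.out.println(a.alloc(50));
    }
}
```

The bulk reset is what keeps the space overhead low: no per-object
metadata, no tracing, just a pointer moved back to zero.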

                --Levi

/*
PLUG: http://plug.org, #utah on irc.freenode.net
Unsubscribe: http://plug.org/mailman/options/plug
Don't fear the penguin.
*/
