On Monday, 26 January 2015 at 22:12:24 UTC, Laeeth Isharc wrote:
"If Java consumes 15% more power doing it, does
it matter on a PC? Most people don't care. Does it matter for
small-scale server environments? Maybe not. Does it matter
when you deploy Hadoop on a 10,000-node cluster, and the
holistic inefficiency (multiple things running concurrently)
goes to 30%? Ask the people who sign the checks for the power
bill. Unfortunately, inefficiency scales really well."
No, Java does not consume 15% more power doing it, because
there isn't just one implementation of Java compilers.
Most commercial JVMs do offer ahead-of-time native code
compilation or JIT caches.
So when those 15% really matter, enterprises do shell out the
money for such JVMs.
The Oracle commercial JVM and OpenJDK are just the reference
implementation.
Thanks for the colour. (For clarity, the content from the link
wasn't by me, and I meant the general gist rather than the
details). How do commercial JVMs rate in terms of memory usage
against thoughtful native (D) code implementations? Is the
basic point mistaken?
So far I have just dabbled in D, because our customers choose
the platforms, not us.
However, these are the kinds of tools you get for analysing
performance in commercial JVMs:
http://www.oracle.com/technetwork/java/javaseproducts/mission-control/java-mission-control-1998576.html
http://www.oracle.com/technetwork/server-storage/solarisstudio/features/performance-analyzer-2292312.html
(I'm just giving the Oracle examples; other vendors have
similar tools.)
With them, you can drill down through the whole JVM and its
interactions at the OS level, and trace performance bottlenecks
all the way down to the generated assembly code.
As for memory usage, Atego JVMs run on quite memory-constrained
devices.
Here is the tiniest of them,
http://www.atego.com/products/atego-perc-ultra/
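On the memory-usage question above, one rough starting point for such a
comparison is to ask the running JVM about its own heap via the standard
`java.lang.Runtime` API (a minimal sketch; the class name is mine, and note
this only covers the managed heap, while a fair comparison with a native D
binary would also have to look at total RSS from the OS side):

```java
// Minimal sketch: report the running JVM's own heap numbers.
// Covers only the managed heap, not the full process footprint.
public class HeapFootprint {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();
        System.out.println("max heap (bytes):  " + rt.maxMemory());
        System.out.println("committed (bytes): " + rt.totalMemory());
        System.out.println("used (bytes):      " + used);
    }
}
```

Run it with different `-Xmx`/`-Xms` settings to see how much the footprint
is a tuning choice rather than a fixed cost of the platform.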
--
Paulo