[
https://issues.apache.org/jira/browse/HADOOP-3644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12611382#action_12611382
]
Steve Loughran commented on HADOOP-3644:
----------------------------------------
* 64-bit Java often needs about 1.5x the memory of 32-bit Java because its
pointers are bigger. JRockit uses "short" (32-bit) pointers for references
within the lowest 4 GB of memory, so it avoids a lot of the bloat, but it has
other "issues".
* I've debated tweaking Ant to apply a different memory multiplier on 64-bit
JVMs, so the heap can get scaled up automatically.
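A minimal sketch of what that multiplier idea might look like, assuming the
rough 1.5x figure above and the Sun-specific `sun.arch.data.model` system
property (reports "32" or "64" on HotSpot); the class and method names here are
made up for illustration:

```java
// Hypothetical sketch: scale a base heap size when running on a 64-bit JVM.
public class HeapScale {

    // Returns baseMb scaled by ~1.5x when dataModel is "64", else unchanged.
    // The 1.5x multiplier is the rough estimate from the comment above.
    static int scaledHeapMb(String dataModel, int baseMb) {
        return "64".equals(dataModel) ? (baseMb * 3) / 2 : baseMb;
    }

    public static void main(String[] args) {
        // sun.arch.data.model is HotSpot-specific; may be null on other JVMs.
        String model = System.getProperty("sun.arch.data.model");
        System.out.println(scaledHeapMb(model, 256) + "m");
    }
}
```

A build tool could compute this once at startup and pass the result as `-Xmx`
to forked JVMs.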
> TestLocalJobControl test gets OutOfMemoryError on 64-bit Java
> -------------------------------------------------------------
>
> Key: HADOOP-3644
> URL: https://issues.apache.org/jira/browse/HADOOP-3644
> Project: Hadoop Core
> Issue Type: Bug
> Reporter: Matei Zaharia
> Priority: Trivial
> Fix For: 0.17.2, 0.18.0
>
> Attachments: testmem.patch
>
>
> The TestLocalJobControl unit test fails on 64-bit Java on Linux with an
> OutOfMemoryError. Here is the exact Java environment:
> $ java -version
> java version "1.5.0_07"
> Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_07-b03)
> Java HotSpot(TM) 64-Bit Server VM (build 1.5.0_07-b03, mixed mode)
> The test runs fine with 32-bit Java. The problem is likely that some of the
> data structures become bigger when using 64-bit pointers. As a fix, I've
> suggested simply increasing the memory available to JUnit.
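For reference, raising the heap for a forked JUnit run in Ant is done with the
`maxmemory` attribute on the `<junit>` task; the fragment below is an
illustrative sketch, not Hadoop's actual build.xml, and the 512m value is a
guess, not the number in testmem.patch:

```xml
<!-- Hypothetical fragment: give the forked JUnit JVM a larger heap
     so 64-bit pointer overhead does not trigger OutOfMemoryError. -->
<junit fork="yes" maxmemory="512m">
  <test name="org.apache.hadoop.mapred.jobcontrol.TestLocalJobControl"/>
</junit>
```

`maxmemory` only takes effect when `fork="yes"`, since it sets `-Xmx` on the
child JVM rather than the one running Ant.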
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.