[ https://issues.apache.org/jira/browse/HADOOP-3644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12612353#action_12612353 ]

Matei Zaharia commented on HADOOP-3644:
---------------------------------------

It's up to you whether to raise the limit by default or not, but the reason I'd 
suggest doing it is to avoid confusing new users. It's disconcerting to check 
out a stable release of Hadoop and see it fail its own unit tests. That's what 
happened to me: I checked out Hadoop, made some small changes, tried running 
the tests, and this completely unrelated test failed. 64-bit Linux is not an 
esoteric platform these days, and raising the limit to 512 MB will probably 
work fine on other platforms too, so why have the unit tests fail on this 
configuration on purpose?
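
For reference, raising the limit comes down to giving the forked JUnit JVM a 
larger heap in the Ant build. The snippet below is only a sketch of that idea, 
assuming the test target uses Ant's standard <junit> task; the property name 
test.junit.maxmemory and the surrounding target and path references are 
illustrative, not the exact contents of Hadoop's build.xml or of testmem.patch.

    <!-- Sketch only: raise the heap of the forked JUnit JVM to 512 MB.
         Property and path names here are illustrative placeholders. -->
    <property name="test.junit.maxmemory" value="512m"/>

    <target name="test">
      <!-- maxmemory is passed to the forked test JVM; it only takes
           effect when fork="yes". -->
      <junit fork="yes" maxmemory="${test.junit.maxmemory}"
             printsummary="yes" haltonfailure="no">
        <classpath refid="test.classpath"/>
        <batchtest todir="${test.build.dir}">
          <fileset dir="${test.src.dir}" includes="**/Test*.java"/>
        </batchtest>
      </junit>
    </target>

If the build exposes it as a property like this, the value can also be 
overridden from the command line with Ant's -D flag, e.g. 
ant test -Dtest.junit.maxmemory=512m.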

> TestLocalJobControl test gets OutOfMemoryError on 64-bit Java
> -------------------------------------------------------------
>
>                 Key: HADOOP-3644
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3644
>             Project: Hadoop Core
>          Issue Type: Bug
>            Reporter: Matei Zaharia
>            Priority: Trivial
>             Fix For: 0.17.2, 0.18.0
>
>         Attachments: testmem.patch
>
>
> The TestLocalJobControl unit test fails on 64-bit Java on Linux with an 
> OutOfMemoryError. Here is the exact Java environment:
> $ java -version
> java version "1.5.0_07"
> Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_07-b03)
> Java HotSpot(TM) 64-Bit Server VM (build 1.5.0_07-b03, mixed mode)
> The test runs fine with 32-bit Java. The problem is likely that some of the 
> data structures become bigger when using 64-bit pointers. As a fix, I've 
> suggested simply increasing the memory available to JUnit.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
