[ https://issues.apache.org/jira/browse/HADOOP-12261?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14660182#comment-14660182 ]

Alan Burlison commented on HADOOP-12261:
----------------------------------------

I've tried setting -Xmx2048m and got a "GC overhead limit exceeded" error; at 
3 GB it seems to work, but as I only get a small fraction of the way through 
the test suite due to multiple other failures, I'm hesitant to say I have it 
fixed. I can't even get the Hadoop test suite to run successfully on a 
stripped-down VM with a vanilla network configuration. The parlous state of 
the Hadoop test suite is a major blocker, to the point where I'm beginning to 
question whether I can realistically complete the addition of Solaris support: 
if I can't test Hadoop, I'll never know whether the changes are actually 
correct.
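
For reference, a minimal sketch of how those heap settings plug into the 
surefire argLine property named in the description below (an assumption: the 
exact property plumbing in hadoop-project/pom.xml may differ):

    <properties>
      <!-- Heap ceiling for the forked test JVMs: 2048m triggered
           "GC overhead limit exceeded"; 3072m appears to be enough. -->
      <maven-surefire-plugin.argLine>-Xmx3072m</maven-surefire-plugin.argLine>
    </properties>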

> Surefire needs to make sure the JVMs it fires up fit within the memory 
> available
> --------------------------------------------------------------------------------
>
>                 Key: HADOOP-12261
>                 URL: https://issues.apache.org/jira/browse/HADOOP-12261
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: test
>    Affects Versions: 2.7.1
>            Reporter: Alan Burlison
>            Assignee: Alan Burlison
>
> hadoop-project/pom.xml sets maven-surefire-plugin.argLine to include 
> -Xmx4096m. Allocating that amount of memory requires a 64-bit JVM, but on 
> platforms with both 32- and 64-bit JVMs surefire runs the 32-bit version by 
> default, and tests fail to start as a result. "-d64" should be added to the 
> command-line arguments to ensure a 64-bit JVM is always used.
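
A sketch of the proposed fix, assuming the argLine is wired up as a single 
Maven property as the description suggests (surrounding configuration 
elided):

    <properties>
      <!-- -d64 selects the 64-bit JVM on platforms that ship both data
           models (e.g. Solaris), so the 4 GB heap can actually be
           allocated; a 32-bit HotSpot cannot honour -Xmx4096m. -->
      <maven-surefire-plugin.argLine>-d64 -Xmx4096m</maven-surefire-plugin.argLine>
    </properties>

On the JDK 7/8 releases current at the time, -d64 is accepted as a no-op 
where only a 64-bit JVM is installed, so the flag should be safe to add 
unconditionally.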



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
