[ https://issues.apache.org/jira/browse/HADOOP-11363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14238432#comment-14238432 ]
stack commented on HADOOP-11363:
--------------------------------

lgtm

Want to just go for 4G rather than 2G, or are you thinking that if we OOME on 2G, we'll take a look at the dumped heaps to see what is going on?

Thanks [~ste...@apache.org]

> Hadoop maven surefire-plugin uses must set heap size
> ----------------------------------------------------
>
>                 Key: HADOOP-11363
>                 URL: https://issues.apache.org/jira/browse/HADOOP-11363
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 2.7.0
>        Environment: java 8
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>       Attachments: HADOOP-11363-001.patch
>
>
> Some of the hadoop tests (especially HBase) are running out of memory on
> Java 8, due to there not being enough heap for them.
> The heap size of surefire test runs is *not* set in {{MAVEN_OPTS}}; it needs
> to be explicitly set as an argument to the test run.
> I propose
> # {{hadoop-project/pom.xml}} defines the maximum heap size and test timeouts
>   for surefire builds as properties
> # modules which run tests use these values for their memory & timeout
>   settings
> # these modules should also set the surefire version they want to use

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
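The proposal quoted above could be sketched roughly as follows in {{hadoop-project/pom.xml}}. This is only an illustration, not the contents of HADOOP-11363-001.patch: the property names ({{maven.test.heap.size}}, {{surefire.fork.timeout}}) and the 2048m/4096m values are hypothetical, though {{argLine}} and {{forkedProcessTimeoutInSeconds}} are real surefire configuration elements.

```xml
<!-- Sketch only: property names and values here are illustrative,
     not necessarily those used by the actual HADOOP-11363 patch. -->
<properties>
  <!-- Max heap (and heap-dump-on-OOME, useful for the 2G-vs-4G question above) -->
  <maven.test.heap.size>-Xmx2048m -XX:+HeapDumpOnOutOfMemoryError</maven.test.heap.size>
  <!-- Per-fork test timeout, in seconds -->
  <surefire.fork.timeout>900</surefire.fork.timeout>
</properties>

<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <!-- Pin the surefire version modules will inherit -->
        <version>2.17</version>
        <configuration>
          <!-- argLine is passed to the forked test JVM; MAVEN_OPTS is not -->
          <argLine>${maven.test.heap.size}</argLine>
          <forkedProcessTimeoutInSeconds>${surefire.fork.timeout}</forkedProcessTimeoutInSeconds>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>
```

Modules that run tests then pick these settings up by inheriting the managed plugin, and can override the properties (e.g. bump {{maven.test.heap.size}} to {{-Xmx4096m}}) without redefining the plugin configuration.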