[ https://issues.apache.org/jira/browse/HADOOP-849?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12461956 ]
Andrzej Bialecki commented on HADOOP-849:
------------------------------------------

I have experienced this problem when trying to debug a mapred application - it was very difficult to figure out what was wrong, because this error was obscuring the real reason (which was a bug in my Mapper class).

Also, speaking with my Nutch hat on: if there are plans for substantial API changes in trunk/, it would be good to have a bugfix release that is still API-compatible and that Nutch could use - there have been tons of fixes since 0.9.2 ...

> randomwriter fails with 'java.lang.OutOfMemoryError: Java heap space' in the 'reduce' task
> -------------------------------------------------------------------------------------------
>
>          Key: HADOOP-849
>          URL: https://issues.apache.org/jira/browse/HADOOP-849
>      Project: Hadoop
>   Issue Type: Bug
>     Reporter: Arun C Murthy
>  Assigned To: Devaraj Das
>      Fix For: 0.9.2
>
>  Attachments: 849.patch
>
>
> Reproducible, tried to increase the child JVM's heap size via
>
> <property>
>   <name>mapred.child.java.opts</name>
>   <value>-Xmx512m</value>
> </property>
>
> without any difference; it still fails.
> Need to investigate further.
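For reference, the same heap-size setting can also be applied per job through the old org.apache.hadoop.mapred JobConf API instead of hadoop-site.xml. A minimal sketch follows; the class name and job name are illustrative only and are not taken from this issue:

    // Hypothetical per-job override of the child JVM heap size, using the
    // classic org.apache.hadoop.mapred API. This sets the same
    // mapred.child.java.opts property the issue reports trying in
    // hadoop-site.xml, but scoped to a single job submission.
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class HeapSizeExample {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(HeapSizeExample.class);
            conf.setJobName("randomwriter-heap-test");   // illustrative name
            // Same value attempted in the issue description:
            conf.set("mapred.child.java.opts", "-Xmx512m");
            JobClient.runJob(conf);
        }
    }

Whether set in hadoop-site.xml or via JobConf, the value only takes effect on child task JVMs launched after the setting is picked up, which may be worth verifying when the error persists as reported here.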