    [ http://issues.apache.org/jira/browse/HADOOP-817?page=comments#action_12458993 ]

Devaraj Das commented on HADOOP-817:
------------------------------------
I think we need to investigate this further; maybe the JVM is not spawned with enough memory. I say this because the same loop merges the map outputs for the sort benchmark as well, and there it successfully merges thousands of files.

> Streaming reducers throw OutOfMemory for not so large inputs
> ------------------------------------------------------------
>
>                 Key: HADOOP-817
>                 URL: http://issues.apache.org/jira/browse/HADOOP-817
>             Project: Hadoop
>          Issue Type: Bug
>          Components: contrib/streaming
>            Reporter: Sanjay Dahiya
>         Assigned To: Sanjay Dahiya
>
> I am seeing OutOfMemoryError for moderate-size inputs (~70 text files, 20k each), causing the job to fail in streaming. For very small inputs it still succeeds. Looking into details.
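
One way to test the low-heap hypothesis is to rerun the failing streaming job with a larger child JVM heap. A minimal sketch, assuming the mapred.child.java.opts property and the streaming -jobconf flag are available in this build; the jar path, input/output paths, and the -Xmx value are illustrative placeholders, not a verified fix:

  # Rerun the failing streaming job with a larger task JVM heap (512m is a guess).
  # <input-dir>, <output-dir>, and the streaming jar location are placeholders.
  bin/hadoop jar contrib/hadoop-streaming.jar \
      -jobconf mapred.child.java.opts=-Xmx512m \
      -input   <input-dir> \
      -output  <output-dir> \
      -mapper  /bin/cat \
      -reducer /bin/cat

If the merge then completes, that would point at heap sizing rather than a leak in the streaming reduce path; if it still fails at the same point, the merge loop itself deserves a closer look.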
