Hi all,

   I am getting an OutOfMemoryError, shown below, when I run a map-reduce
job on a large amount of data:
java.lang.OutOfMemoryError: Java heap space
        at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:52)
        at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:90)
        at org.apache.hadoop.io.SequenceFile$Reader.nextRawKey(SequenceFile.java:1974)
        at org.apache.hadoop.io.SequenceFile$Sorter$SegmentDescriptor.nextRawKey(SequenceFile.java:3002)
        at org.apache.hadoop.io.SequenceFile$Sorter$MergeQueue.merge(SequenceFile.java:2802)
        at org.apache.hadoop.io.SequenceFile$Sorter.merge(SequenceFile.java:2511)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.mergeParts(MapTask.java:1040)
        at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:698)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:220)
        at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2124)
The error occurs almost at the end of the map task. I have set the heap size
to 1 GB, but the problem persists. Can someone please help me figure out how
to avoid this error?
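
In case the configuration itself is the problem: as far as I understand, the
heap for each map/reduce task is set through mapred.child.java.opts (the
child JVMs default to just -Xmx200m; HADOOP_HEAPSIZE only sizes the daemons),
and the final merge shown in the trace is influenced by io.sort.mb and
io.sort.factor. Here is a rough sketch of how I am setting these; MyJob and
the exact values are placeholders rather than my real job:

    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class MyJob {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(MyJob.class);
            // Heap for each child task JVM; the map-side merge in the
            // trace runs under this limit, not under HADOOP_HEAPSIZE.
            conf.set("mapred.child.java.opts", "-Xmx1024m");
            // In-memory buffer (MB) for sorting map output before spilling.
            conf.setInt("io.sort.mb", 100);
            // Number of spill segments merged at once in the final merge.
            conf.setInt("io.sort.factor", 10);
            // ... mapper, reducer, and input/output setup elided ...
            JobClient.runJob(conf);
        }
    }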