Hello,

What version of Hadoop are you using?
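
Also, note that the stack trace points at the map-side merge, which runs inside the child task JVM, so the setting that matters is the per-task heap rather than the daemon heap (HADOOP_HEAPSIZE). As a rough sketch only (property names from the 0.18-era configuration; the values are purely illustrative, so adjust for your cluster), something like this in hadoop-site.xml may help:

```xml
<!-- hadoop-site.xml: illustrative values only -->
<property>
  <!-- heap for each map/reduce child JVM, not the TaskTracker itself -->
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
<property>
  <!-- a smaller in-memory sort buffer leaves more headroom for the merge -->
  <name>io.sort.mb</name>
  <value>100</value>
</property>
```

Whether these apply depends on your Hadoop version, hence my question below.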

Regards,

Leon Mergen

> -----Original Message-----
> From: Pallavi Palleti [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, September 17, 2008 2:36 PM
> To: [email protected]
> Subject: OutOfMemory Error
>
>
> Hi all,
>
>    I am getting an OutOfMemory error, as shown below, when I run a
> map-reduce job on a huge amount of data:
> java.lang.OutOfMemoryError: Java heap space
>         at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:52)
>         at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:90)
>         at org.apache.hadoop.io.SequenceFile$Reader.nextRawKey(SequenceFile.java:1974)
>         at org.apache.hadoop.io.SequenceFile$Sorter$SegmentDescriptor.nextRawKey(SequenceFile.java:3002)
>         at org.apache.hadoop.io.SequenceFile$Sorter$MergeQueue.merge(SequenceFile.java:2802)
>         at org.apache.hadoop.io.SequenceFile$Sorter.merge(SequenceFile.java:2511)
>         at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.mergeParts(MapTask.java:1040)
>         at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:698)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:220)
>         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2124)
> The above error comes almost at the end of the map job. I have set the
> heap size to 1 GB, but the problem persists. Can someone please tell me
> how to avoid this error?
> --
> View this message in context: http://www.nabble.com/OutOfMemory-Error-
> tp19531174p19531174.html
> Sent from the Hadoop core-user mailing list archive at Nabble.com.