You can control the heap size of the child JVM using the 'mapred.child.java.opts' option.

Check your program logic, though. In my experience, running out of heap
space in a map task usually points to some runaway logic somewhere.
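
For example, a minimal sketch of raising the child heap in your job
configuration (the file location and the 512 MB value are illustrative;
adjust for your cluster):

```xml
<!-- In hadoop-site.xml (or set per-job via JobConf.set) -->
<property>
  <name>mapred.child.java.opts</name>
  <!-- JVM options passed to each map/reduce child process;
       -Xmx512m raises the max heap from the small default -->
  <value>-Xmx512m</value>
</property>
```

Keep in mind this multiplies across concurrent tasks per node, so size it
against the machine's physical memory.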

-----Original Message-----
From: Rui Shi [mailto:[EMAIL PROTECTED] 
Sent: Thursday, December 06, 2007 12:31 PM
To: [email protected]
Subject: Mapper Out of Memory


Hi,

I run Hadoop on a BSD4 cluster, and each map task processes a gzip file
(about 10MB). Some tasks finished, but many of them failed because the
heap ran out of memory. I got the following syslogs:

2007-12-06 12:16:50,277 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2007-12-06 12:16:53,128 INFO org.apache.hadoop.mapred.MapTask:
numReduceTasks: 256
2007-12-06 12:16:53,638 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2007-12-06 12:18:19,079 WARN org.apache.hadoop.mapred.TaskTracker: Error
running child
java.lang.OutOfMemoryError: Java heap space

Does anyone know what the reason is and how we can avoid it?

Thanks,

Rui





 