Hi,

It is hard to believe that you would need to enlarge the heap given that the 
input is only about 10MB per task. In particular, you don't load all of the 
input at once; the framework feeds the mapper one record at a time. As for my 
program logic, there is nothing fancy, mostly cutting fields and sorting, so 
GC should be able to handle it...
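
To make that concrete, here is a rough sketch of what I have in mind 
(FieldCutMapper is a made-up class written against the old 
org.apache.hadoop.mapred API; exact signatures vary across Hadoop versions). 
A map() that handles one record at a time keeps heap use flat no matter how 
big the input file is, while the commented-out field shows the kind of 
runaway state Joydeep mentions:

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    // Hypothetical "cut"-style mapper: emits the first tab-separated field.
    public class FieldCutMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

      // A field like the one below, growing on every call, is the kind of
      // runaway logic that exhausts the child JVM heap:
      //   private final List<Text> seen = new ArrayList<Text>();

      public void map(LongWritable key, Text value,
                      OutputCollector<Text, Text> output, Reporter reporter)
          throws IOException {
        // One record per call; nothing is retained between calls,
        // so heap use stays flat regardless of input size.
        String firstField = value.toString().split("\t", 2)[0];
        output.collect(new Text(firstField), value);
      }
    }

If a mapper does keep something like that list around, even a 10MB gzip 
(which expands several-fold after decompression) can push past the default 
child heap.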

Thanks,

Rui


----- Original Message ----
From: Joydeep Sen Sarma <[EMAIL PROTECTED]>
To: [email protected]
Sent: Thursday, December 6, 2007 1:14:51 PM
Subject: RE: Mapper Out of Memory


You can control the heap size using the 'mapred.child.java.opts' option.
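
For example, a rough sketch of a driver that raises it for one job (the 
class name and the 512m value are made up for illustration; the same 
property can also be set cluster-wide in hadoop-site.xml):

    import org.apache.hadoop.mapred.JobConf;

    // Hypothetical driver fragment: raise the map/reduce child JVM heap
    // for a single job.
    public class WordCutDriver {
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(WordCutDriver.class);
        // Passed to each child task JVM when it is launched.
        conf.set("mapred.child.java.opts", "-Xmx512m");
        // ... set input/output paths, mapper, reducer,
        //     then submit with JobClient.runJob(conf);
      }
    }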

Check your program logic first, though. My personal experience is that
running out of heap space in a map task usually points to some runaway
logic somewhere.

-----Original Message-----
From: Rui Shi [mailto:[EMAIL PROTECTED] 
Sent: Thursday, December 06, 2007 12:31 PM
To: [email protected]
Subject: Mapper Out of Memory


Hi,

I am running Hadoop on a BSD4 cluster, and the input to each map task is a
gzip file (about 10MB). Some tasks finished, but many of them failed with
heap out-of-memory errors. I got the following syslogs:

2007-12-06 12:16:50,277 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2007-12-06 12:16:53,128 INFO org.apache.hadoop.mapred.MapTask:
numReduceTasks: 256
2007-12-06 12:16:53,638 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2007-12-06 12:18:19,079 WARN org.apache.hadoop.mapred.TaskTracker: Error
running child
java.lang.OutOfMemoryError: Java heap space

Does anyone know what the reason is and how we can avoid it?

Thanks,

Rui