I am running my Pig scripts on our QA cluster (4 datanodes, see below),
which has the Cloudera CDH2 release installed; the global heap max is
-Xmx4096m. I am constantly getting OutOfMemory errors (see below) on my
map and reduce jobs when I run my script against a large data set that
produces around 600 maps.
I am looking for tips on the best Pig/Hadoop configuration to get rid of
these errors. Thanks.


Error: GC overhead limit exceeded
Error: java.lang.OutOfMemoryError: Java heap space
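In case it helps, I assume the per-task JVM heap is what matters here
rather than the global max, and that it is controlled by
mapred.child.java.opts in mapred-site.xml (sketched from memory; the
exact property name and value for CDH2 may need checking):

```xml
<!-- mapred-site.xml fragment: heap given to each spawned map/reduce task JVM.
     Value below is only an illustration, not what our cluster currently uses. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
```

Is raising this (or lowering the number of concurrent task slots per
node so 4 GB is not oversubscribed) the right knob to turn?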

Regards
Syed
