unsubscribe

On Fri, Jul 9, 2010 at 10:59 PM, anshul goel <anshu...@gmail.com> wrote:

> unsubscribe
>
>
> On Fri, Jul 9, 2010 at 10:44 PM, Shuja Rehman <shujamug...@gmail.com> wrote:
>
>> Hi All
>>
>> I am facing a hard problem. I am running a MapReduce job using Hadoop
>> streaming, but it fails with the following error:
>>
>> Caught: java.lang.OutOfMemoryError: Java heap space
>>         at Nodemapper5.parseXML(Nodemapper5.groovy:25)
>> java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
>>         at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:362)
>>         at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:572)
>>         at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:136)
>>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
>>         at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:36)
>>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>>         at org.apache.hadoop.mapred.Child.main(Child.java:170)
>>
>>
>>
>> I have increased the heap size in hadoop-env.sh to 2000M. I have also set it
>> for the job explicitly with the following option:
>>
>> -D mapred.child.java.opts=-Xmx2000M \
>>
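>> For reference, that option goes on the streaming command line ahead of the
>> streaming-specific switches, roughly like the sketch below (the jar path and
>> the input/output paths are placeholders, not my actual ones):
>>
>>   hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-*-streaming.jar \
>>       -D mapred.child.java.opts=-Xmx2000M \
>>       -input /data/xml \
>>       -output /data/out \
>>       -mapper /root/Nodemapper5.groovy \
>>       -file /root/Nodemapper5.groovy
>>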
>> But it still gives the error. The same script runs fine if I run it on the
>> shell with a 1024M heap, like this:
>>
>> cat file.xml | /root/Nodemapper5.groovy
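>>
>> (In this shell test the 1024M heap is the Groovy JVM's own heap; one way to
>> pin it, just as an illustration, is:
>>
>>   # illustrative only: cap the Groovy JVM at 1024M for the pipe test
>>   export JAVA_OPTS="-Xmx1024m"
>>   cat file.xml | /root/Nodemapper5.groovy
>>
>> the exact mechanism aside, the point is that the same script and the same
>> input run fine in 1024M outside Hadoop.)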
>>
>>
>> Any clue?
>>
>> Thanks in advance.
>>
>>
>> --
>> Regards
>> Shuja-ur-Rehman Baig
>> _________________________________
>> MS CS - School of Science and Engineering
>> Lahore University of Management Sciences (LUMS)
>> Sector U, DHA, Lahore, 54792, Pakistan
>> Cell: +92 3214207445
>>
>
>
