Hi!

I've hit the following error:

2012-02-18 13:11:08,347 WARN  mapred.LocalJobRunner - job_local_0001
java.lang.OutOfMemoryError: Java heap space
    at org.apache.hadoop.mapred.IFile$Reader.readNextBlock(IFile.java:342)
    at org.apache.hadoop.mapred.IFile$Reader.next(IFile.java:404)
    at org.apache.hadoop.mapred.Merger$Segment.next(Merger.java:220)
    at org.apache.hadoop.mapred.Merger$MergeQueue.adjustPriorityQueue(Merger.java:330)
    at org.apache.hadoop.mapred.Merger$MergeQueue.next(Merger.java:350)
    at org.apache.hadoop.mapred.Merger.writeFile(Merger.java:156)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.mergeParts(MapTask.java:1535)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1154)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:359)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
2012-02-18 13:11:08,603 ERROR fetcher.Fetcher - Fetcher: java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1252)
    at org.apache.nutch.fetcher.Fetcher.fetch(Fetcher.java:1204)
    at org.apache.nutch.fetcher.Fetcher.run(Fetcher.java:1240)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.nutch.fetcher.Fetcher.main(Fetcher.java:1213)


This happens on a crawl of ~175K links.

AFAIK, I have two options: 1) increase the number of segments via the bin/nutch generate tool, or 2) increase the heap memory via the "mapred.map.child.java.opts" parameter.
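
For reference, this is roughly what I have in mind; the generate flags and the -Xmx value below are just examples from my reading of the docs, so please correct me if they're off:

1) generate more, smaller segments, e.g.:

    bin/nutch generate crawl/crawldb crawl/segments -topN 50000 -maxNumSegments 4

2) raise the map task heap by adding this to conf/nutch-site.xml:

    <property>
      <name>mapred.map.child.java.opts</name>
      <value>-Xmx2048m</value>
    </property>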

Can I do both of them in 'local' mode?

If not, could you please tell me how I could do it?
