Do you have enough data to start more than one mapper?
If the entire input is smaller than a single HDFS block, it produces only one input split, so only one mapper will run.
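If you really want one line per map task regardless of file size, one option worth looking at is NLineInputFormat, which creates one split per N input lines. This is only a sketch against the new (org.apache.hadoop.mapreduce) API, not what your current job does, and the driver/class names here are made up for illustration:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.NLineInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class OneLinePerMapperDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "one-line-per-mapper");
        job.setJarByClass(OneLinePerMapperDriver.class);

        // Each split (and therefore each map task) gets exactly one input line.
        job.setInputFormatClass(NLineInputFormat.class);
        NLineInputFormat.setNumLinesPerSplit(job, 1);

        NLineInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Set your mapper class, output types, number of reducers, etc. as needed.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

With one line per split, the scheduler has one task per line to hand out, so the work can spread across nodes instead of landing on a single mapper.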

Best Regards,
Anil

On Feb 1, 2012, at 4:21 PM, Mark Kerzner <[email protected]> wrote:

> Hi,
> 
> I have a simple MR job, and I want each Mapper to get one line from my
> input file (which contains further instructions for lengthy processing).
> Each line is 100 characters long, and I tell Hadoop to read only 100 bytes,
> 
> job.getConfiguration().setInt("mapreduce.input.linerecordreader.line.maxlength",
> 100);
> 
> I see that this part works - it reads only one line at a time, and if I
> change this parameter, the change takes effect.
> 
> However, on a cluster only one node receives all the map tasks. Only one
> map task is started; the others never get anything, they just wait. I've
> added a 100-second wait in the mapper - no change!
> 
> Any advice?
> 
> Thank you. Sincerely,
> Mark
