Dmitriy,
  Thanks. The combination of pig.splitCombination=false and mapred.min.split.size
worked.
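
For the archives, the working invocation would look roughly like the sketch
below. The script name and the 1 MB split size are placeholders, not values
from this thread; the split size should be set no larger than the file's
reduced dfs.block.size so each block becomes its own split.

```shell
# Sketch, assuming the reduced block size was ~1 MB (placeholder value).
# Disable Pig's split combination and cap the minimum split size so that
# each HDFS block of the small-block file gets its own map task.
pig -Dpig.splitCombination=false \
    -Dmapred.min.split.size=1048576 \
    myscript.pig
```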

Sam

On Apr 23, 2012, at 5:42 PM, Dmitriy Ryaboy wrote:

> Set mapred.min.split.size
> 
> D
> 
> On Mon, Apr 23, 2012 at 4:30 PM, Sam William <[email protected]> wrote:
>> I have a file on HDFS with a reduced block size.  I created it by overriding
>> the dfs.block.size param on the hadoop fs -put command.  hadoop fsck shows
>> that this file has 15 blocks (as opposed to the normal 1 block).  I did this
>> to force Pig to use more maps than normal.  On my pig command line I
>> specify 'pig -Dpig.splitCombination=false' to turn off the default split
>> combination logic.  The job still ends up running just one mapper.  How can I
>> achieve multiple maps?  Splitting the original file into multiple files
>> would be my last resort.
>> 
>> 
>> 
>> Sam William
>> [email protected]
>> 
>> 
>> 

Sam William
[email protected]


