http://search-hadoop.com/m/pF9001VX6SH/default.block.size&subj=Re+about+block+size
http://search-hadoop.com/m/HItS5IClD21/block+size&subj=Newbie+question+on+block+size+calculation

http://www.bodhtree.com/blog/2012/09/28/hadoop-how-to-manage-huge-numbers-of-small-files-in-hdfs/

http://wiki.apache.org/hadoop/HowManyMapsAndReduces

Thanks,
Manoj

From: YouPeng Yang [mailto:[email protected]]
Sent: Thursday, May 09, 2013 9:13 PM
To: [email protected]
Subject: issues with decreasing the default.block.size


Hi all,

     I am going to set up a new Hadoop environment. Because there are lots of small files, I would like to change the default.block.size to 16MB rather than merging the files into large enough ones (e.g. using SequenceFiles).
    I want to ask: are there any bad influences or issues?
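A minimal sketch of the kind of change being described, assuming the property meant is the standard HDFS block-size setting (dfs.blocksize in Hadoop 2.x, dfs.block.size in 1.x); the class name and file path below are only illustrative, and cluster-wide you would normally put this in hdfs-site.xml rather than in client code:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class SmallBlockSizeExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // The value is in bytes: 16 MB = 16777216.
            // Cluster-wide this would normally be set in hdfs-site.xml;
            // it is overridden per-client here only for illustration.
            conf.setLong("dfs.blocksize", 16L * 1024 * 1024);
            FileSystem fs = FileSystem.get(conf);
            // Files created through this client will use the smaller block size.
            FSDataOutputStream out = fs.create(new Path("/tmp/blocksize-test.txt"));
            out.writeUTF("block size test");
            out.close();
        }
    }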

Regards
