Thanks for your reply. However, I don't think this is a 32-bit library issue, because my Hadoop is 64-bit (I compiled it from source). I suspect my way of installing Snappy is wrong.
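One way to confirm whether the native Snappy binding is actually being loaded (a diagnostic sketch, assuming the `hadoop` launcher from the 2.4.1 install is used and the native-library directory is the `/usr/lib/hadoop/lib/native/` path from the config below; `checknative` ships with Hadoop 2.4+):

```shell
# List the native libraries Hadoop can load; the "snappy" line should
# report "true" with the path to libsnappy.so if the binding works.
bin/hadoop checknative -a

# The native Hadoop library should also resolve its snappy dependency
# cleanly (path assumed from the java.library.path in mapred-site.xml):
ldd /usr/lib/hadoop/lib/native/libhadoop.so | grep snappy
```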
Arthur

On 19 Aug, 2014, at 11:53 pm, Andre Kelpe <ake...@concurrentinc.com> wrote:

> Could this be caused by the fact that hadoop no longer ships with 64bit libs?
> https://issues.apache.org/jira/browse/HADOOP-9911
>
> - André
>
>
> On Tue, Aug 19, 2014 at 5:40 PM, arthur.hk.c...@gmail.com
> <arthur.hk.c...@gmail.com> wrote:
> Hi,
>
> I am trying Snappy in Hadoop 2.4.1; here are my steps:
>
> (CentOS 64-bit)
>
> 1)
> yum install snappy snappy-devel
>
> 2)
> added the following to core-site.xml:
> <property>
>   <name>io.compression.codecs</name>
>   <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
> </property>
>
> 3)
> mapred-site.xml:
> <property>
>   <name>mapreduce.admin.map.child.java.opts</name>
>   <value>-server -XX:NewRatio=8 -Djava.library.path=/usr/lib/hadoop/lib/native/ -Djava.net.preferIPv4Stack=true</value>
>   <final>true</final>
> </property>
> <property>
>   <name>mapreduce.admin.reduce.child.java.opts</name>
>   <value>-server -XX:NewRatio=8 -Djava.library.path=/usr/lib/hadoop/lib/native/ -Djava.net.preferIPv4Stack=true</value>
>   <final>true</final>
> </property>
>
> 4) smoke test
> bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar teragen 100000 /tmp/teragenout
>
> I got the following warnings, and no test file was created in HDFS:
>
> 14/08/19 22:50:10 WARN mapred.YARNRunner: Usage of -Djava.library.path in mapreduce.admin.map.child.java.opts can cause programs to no longer function if hadoop native libraries are used. These values should be set as part of the LD_LIBRARY_PATH in the map JVM env using mapreduce.admin.user.env config settings.
> 14/08/19 22:50:10 WARN mapred.YARNRunner: Usage of -Djava.library.path in mapreduce.admin.reduce.child.java.opts can cause programs to no longer function if hadoop native libraries are used. These values should be set as part of the LD_LIBRARY_PATH in the reduce JVM env using mapreduce.admin.user.env config settings.
>
> Can anyone please advise how to install and enable Snappy in Hadoop 2.4.1? What could be wrong? Is my change in mapred-site.xml incorrect?
>
> Regards
> Arthur
>
> --
> André Kelpe
> an...@concurrentinc.com
> http://concurrentinc.com
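For what it's worth, the YARNRunner warnings quoted above point at a concrete fix: drop `-Djava.library.path` from the admin child JVM opts and export the native directory via `mapreduce.admin.user.env` instead. A sketch of that mapred-site.xml change, reusing the native-library path from the original config (your path may differ):

```xml
<property>
  <name>mapreduce.admin.user.env</name>
  <!-- Makes libsnappy/libhadoop visible to map and reduce JVMs
       without overriding java.library.path -->
  <value>LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native</value>
</property>
```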