I noticed that a MapReduce job failed with this exception:
Exception: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
I googled it and realized that the Snappy native library was missing. I
installed Snappy and copied libsnappy.so* to $HADOOP_HOME/lib/native.
In .bashrc I added:
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/"
export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native/"
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
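To sanity-check these settings, a quick sketch like the following can help (the install path is the one used elsewhere in this post and is just a guess for your layout; `hadoop checknative -a` is the Hadoop 2.x diagnostic that reports, per native library including snappy, whether it loaded):

```shell
# Hypothetical default; substitute your actual install root.
HADOOP_HOME="${HADOOP_HOME:-/home/hadoop/software/hadoop-2.4.1}"
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native"

# The native dir should contain both libhadoop.so* and libsnappy.so*.
ls "$HADOOP_HOME/lib/native" 2>/dev/null || echo "native dir missing"

# Reports true/false for each native library when hadoop is on PATH.
command -v hadoop >/dev/null && hadoop checknative -a || echo "hadoop not on PATH"
```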
In core-site.xml I added:
<property>
<name>io.compression.codecs</name>
<value>
org.apache.hadoop.io.compress.GzipCodec,
org.apache.hadoop.io.compress.DefaultCodec,
org.apache.hadoop.io.compress.BZip2Codec,
org.apache.hadoop.io.compress.SnappyCodec
</value>
</property>
In mapred-site.xml I added:
<property>
<name>mapreduce.map.output.compress</name>
<value>true</value>
</property>
<property>
<name>mapred.map.output.compress.codec</name>
<value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
<name>mapreduce.admin.user.env</name>
<value>LD_LIBRARY_PATH=/home/hadoop/software/hadoop-2.4.1/lib/native</value>
</property>
In yarn-site.xml I added:
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>LD_LIBRARY_PATH=/home/hadoop/software/hadoop-2.4.1/lib/native</value>
</property>
Finally I made the same settings on the datanodes, but it still didn't
work. Yet on a pseudo-distributed cluster, simply copying libsnappy.so* to
$HADOOP_HOME/lib/native works fine without any configuration changes.
Why is it so difficult on this cluster?
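For reference, here is roughly how I could confirm that every worker node actually has the libraries in place (the hostnames and the path are placeholders for my cluster, not something prescribed by Hadoop):

```shell
# Placeholder worker names; substitute the entries from your slaves file.
WORKERS="datanode1 datanode2"
NATIVE_DIR="/home/hadoop/software/hadoop-2.4.1/lib/native"

for host in $WORKERS; do
  # BatchMode avoids hanging on a password prompt inside a script.
  if ssh -o BatchMode=yes -o ConnectTimeout=5 "$host" \
      "ls $NATIVE_DIR/libsnappy.so* $NATIVE_DIR/libhadoop.so*" >/dev/null 2>&1; then
    echo "$host: native libs present"
  else
    echo "$host: native libs missing or host unreachable"
  fi
done
```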