Yes. Please add the following to yarn-site.xml:
  <property>
    <name>yarn.nodemanager.admin-env</name>
    <value>MALLOC_ARENA_MAX=$MALLOC_ARENA_MAX,LD_LIBRARY_PATH=$HADOOP_COMMON_HOME/lib/native:/opt/snappy/lib:$LD_LIBRARY_PATH</value>
  </property>

  <property>
    <name>yarn.app.mapreduce.am.admin.user.env</name>
    <value>LD_LIBRARY_PATH=$HADOOP_COMMON_HOME/lib/native:/opt/snappy/lib:$LD_LIBRARY_PATH</value>
  </property>

Add the following to mapred-site.xml:

  <property>
    <name>mapreduce.admin.user.env</name>
    <value>LD_LIBRARY_PATH=$HADOOP_COMMON_HOME/lib/native:/opt/snappy/lib:$LD_LIBRARY_PATH</value>
  </property>
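If you also want jobs to actually use Snappy once the native library is
found, one option is to enable it for intermediate map output. This is only
an illustrative snippet using the standard Hadoop 2.x property names, not
something required for the library loading itself:

  <!-- compress intermediate map output with Snappy (optional example) -->
  <property>
    <name>mapreduce.map.output.compress</name>
    <value>true</value>
  </property>
  <property>
    <name>mapreduce.map.output.compress.codec</name>
    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
  </property>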

Then restart YARN.
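For example, assuming the standard sbin scripts under your Hadoop home (the
paths below are illustrative), the restart and a quick sanity check could
look like:

  # restart the YARN daemons so the new admin-env settings take effect
  /opt/hadoop/hadoophome/sbin/stop-yarn.sh
  /opt/hadoop/hadoophome/sbin/start-yarn.sh

  # confirm the native snappy library is still detected
  hadoop checknative -a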


On Mon, Mar 16, 2015 at 2:54 PM, donhoff_h <[email protected]> wrote:

> Hi, Azuryy.
>
> Thanks for your reply.
>
> Is there any way that does not require copying the snappy libs? I mean, is
> there an environment variable I can configure so that I don't have to copy
> the snappy libs? I have tried JAVA_LIBRARY_PATH and LD_LIBRARY_PATH, but
> they didn't work.
>
>
> ------------------ Original ------------------
> From: "Azuryy Yu" <[email protected]>
> Send time: Monday, Mar 16, 2015 2:43 PM
> To: "[email protected]" <[email protected]>
> Subject: Re: Snappy Configuration in Hadoop2.5.2
>
> Hi,
> Please run "cp -a /opt/snappy/lib/libsnappy.* /opt/hadoop/hadoophome/lib/native"
> on each datanode. You also need to install Snappy on each datanode first.
>
> On Sat, Mar 7, 2015 at 6:57 PM, donhoff_h <[email protected]> wrote:
>
>> Hi, experts.
>>
>> I met the following problem when configuring the Snappy lib in
>> Hadoop 2.5.2.
>>
>> My snappy installation home is /opt/snappy
>> My Hadoop installation home is /opt/hadoop/hadoophome
>>
>> To configure the snappy path, I tried to add the following environment
>> variables in /etc/profile and hadoop-env.sh:
>> export JAVA_LIBRARY_PATH=/opt/hadoop/hadoophome/lib/native:/opt/snappy/lib
>> export LD_LIBRARY_PATH=/opt/hadoop/hadoophome/lib/native:/opt/snappy/lib
>>
>> After the configuration, I ran the command "hadoop checknative". The
>> result was as follows, which I think means Hadoop can find the snappy
>> lib:
>> Native library checking:
>> hadoop: true /opt/hadoop/hadoop-2.5.2/lib/native/libhadoop.so.1.0.0
>> zlib:   true /lib64/libz.so.1
>> snappy: true /opt/snappy/lib/libsnappy.so.1
>> lz4:    true revision:99
>> bzip2:  false
>>
>> But when I ran a MapReduce Job, it reported the following error:
>> Error: java.lang.RuntimeException: native snappy library not available:
>> SnappyCompressor has not been loaded.
>>     at
>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:69)
>>     at
>> org.apache.hadoop.io.compress.SnappyCodec.createCompressor(SnappyCodec.java:143)
>>     at
>> org.apache.hadoop.io.compress.SnappyCodec.createOutputStream(SnappyCodec.java:98)
>>     at
>> org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:136)
>>
>> I also tried to set io.compression.codecs, but it did not work either.
>>
>> The only way I found that worked is to make a soft link as follows:
>> ln -s /opt/snappy/lib/libsnappy.so.1.2.1
>> /opt/hadoop/hadoophome/lib/native/libsnappy.so.1
>>
>> I configured snappy successfully in Hadoop 2.4.0 before. I remember that I
>> only needed to set LD_LIBRARY_PATH in /etc/profile; there was no need to
>> make such a soft link. Does Hadoop 2.5.2 no longer support this
>> configuration? Or is there another way to configure it in Hadoop 2.5.2
>> that doesn't require making links or copying the lib into Hadoop's
>> lib/native directory?
>>
>> Many Thanks!
>>
>
>
