On Sun, Jul 13, 2014 at 10:28 PM, Esteban Gutierrez <[email protected]>
wrote:

> Hello Ankit,
>
> The only reason the test can fail on the master is that the snappy native
> libraries are not installed correctly. Have you tried running the
> compression test (hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp snappy) on the master? Does it work? If it works correctly,
> then you only need to restart the HBase masters to get it working.
>
> cheers,
> esteban.
>
What Esteban says.

We added a little bit more on how hbase finds native libs to the doc but
have not pushed it out to the website.  Perhaps it will help in this case
(pardon the formatting):

     <note xml:id="hbase.native.platform"><title>On the location of native
libraries</title>

         <para>Hadoop looks in <filename>lib/native</filename> for .so
             files; HBase looks in
             <filename>lib/native/PLATFORM</filename>. See the
             <command>bin/hbase</command> script: view the file and search
             for <varname>native</varname> to see how we determine which
             platform we are running on, by running a small Java program,
             <classname>org.apache.hadoop.util.PlatformName</classname>.
             We then add <filename>./lib/native/PLATFORM</filename> to the
             <varname>LD_LIBRARY_PATH</varname> environment variable before
             the JVM starts.
             The JVM will look there (as well as in any other dirs
             specified on <varname>LD_LIBRARY_PATH</varname>)
             for codec native libs. If you are unable to figure out your
             'platform', do:
             <programlisting>$ ./bin/hbase org.apache.hadoop.util.PlatformName</programlisting>
             An example platform would be <varname>Linux-amd64-64</varname>.
             </para>

     </note>
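To make the lookup concrete, here is a rough shell sketch of what bin/hbase
does with the platform string. The platform value is hard-coded to the example
from the note above; on a real install it would come from running the
PlatformName class, and the HBASE_HOME default here is an assumption:

```shell
# Sketch of the native-lib path setup bin/hbase performs. The platform
# string below is the example value from the note; on a real install get it
# with:  ./bin/hbase org.apache.hadoop.util.PlatformName
PLATFORM="Linux-amd64-64"
HBASE_HOME="${HBASE_HOME:-.}"
# Prepend the platform-specific native dir so the JVM can find codec .so
# files (snappy, lzo, etc.) when it starts.
LD_LIBRARY_PATH="$HBASE_HOME/lib/native/$PLATFORM${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"
```

If the snappy .so files are not under that directory for your platform, the
CompressionTest above will keep failing no matter how many times the master
is restarted.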

St.Ack
