Hi all,

Thanks for the reply.

Finally it worked by following this post:
http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U8Pui-9ZuZY

The issue was:

By default the hadoop-2 release ships with 32-bit native libraries, so on a 64-bit machine there is effectively no native support. I had to compile the Hadoop source code for the 64-bit architecture with native support; after that it worked fine.
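
In case it helps someone else, here is roughly what the fix boiled down to (the paths and output location below are just examples from my setup, and the build assumes the usual Hadoop native prerequisites such as Maven, gcc, cmake and the snappy development headers):

# build Hadoop from source with 64-bit native libraries
cd /path/to/hadoop-source
mvn package -Pdist,native -DskipTests -Dtar

# the rebuilt libs land under hadoop-dist/target/hadoop-<version>/lib/native;
# point HBase at that directory (e.g. in hbase-env.sh) and re-run the smoke test
export HBASE_LIBRARY_PATH=/path/to/hadoop/lib/native
bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy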



On Mon, Jul 14, 2014 at 3:09 PM, Stack <[email protected]> wrote:

> On Mon, Jul 14, 2014 at 7:49 AM, Hanish Bansal <
> [email protected]> wrote:
>
> > Hi All,
> >
> > We have tried the following things:
> >
> > 1. Pointed HBase to hadoop and snappy libraries which hadoop holds :
> >
> > export HBASE_LIBRARY_PATH=/pathtoyourhadoop/lib/native/Linux-amd64-64
> >
> > As that Hadoop directory holds both the hadoop and snappy native libraries, this should have worked, but it didn't.
> >
> > 2. Copied libhadoop.so and libsnappy.so to the HBase native library folder
> > at $HBASE_HOME/lib/native/Linux-amd64-64/.
> >
> > It also didn't work.
> >
> > *Ran the compression test tool and got the error below:*
> >
> > [root@IMPETUS-I0141 hbase-0.98.3-hadoop2]# bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
> > 2014-07-11 16:05:10,572 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > 2014-07-11 16:05:11,006 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> > 2014-07-11 16:05:11,241 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> > 2014-07-11 16:05:11,242 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> > Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
> >     at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
> >     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:62)
> >     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
> >     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
> >     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
> >     at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >     at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >     at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >     at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >     at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >     at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >
> > Everything was working fine with hbase-0.94.5 as well as hbase-0.96.1.
> >
> >
> >
> It is not finding your native lib.   Did you do this on the command line or
> in the hbase-env.sh script?
>
> export HBASE_LIBRARY_PATH=/pathtoyourhadoop/lib/native/Linux-amd64-64
>
> Did you change 'pathtoyourhadoop' to point at the actual library?  Is the lib
> there?  Is your architecture Linux-amd64-64 for sure? (See the text I
> posted last night for how to figure out what your system is.)
>
> St.Ack
>
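
Regarding the architecture question above: the quickest check is to look at the shipped library directly (the path below is just an example, adjust it to your install):

uname -m
file /pathtoyourhadoop/lib/native/libhadoop.so

If file reports a 32-bit ELF object on an x86_64 machine, the library can never be loaded, and you get the "Unable to load native-hadoop library" warning followed by the UnsatisfiedLinkError above; the 64-bit rebuild is what fixed that here.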



-- 
*Thanks & Regards*
*Hanish Bansal*
