Are you using Cloudera CDH3? If so, you only need to install hadoop-0.20-native.
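Either way, a quick way to check what the JVM will actually see: HBase prepends HBASE_LIBRARY_PATH to java.library.path, and the JVM then probes each colon-separated directory in order for libsnappy.so. A rough sketch of that probe (the find_native_lib helper below is purely illustrative, not part of HBase):

```shell
# Illustrative helper: reproduce the JVM's native-library probe so you can
# check a search path before restarting HBase. Not an HBase tool.
find_native_lib() {
  lib="$1"
  path="${2:-${HBASE_LIBRARY_PATH:-/usr/local/lib}}"
  # Walk the colon-separated path and report the first directory that
  # contains the library, the same first-hit order the JVM uses.
  echo "$path" | tr ':' '\n' | while IFS= read -r dir; do
    if [ -e "$dir/$lib" ]; then
      echo "$dir/$lib"
      break
    fi
  done
}

# Example: find_native_lib libsnappy.so /usr/local/lib
```

If this prints nothing for the path you exported, restarting HBase won't help; the library genuinely isn't where the JVM is told to look.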
On Sun, Dec 2, 2012 at 12:57 AM, Jean-Marc Spaggiari <[email protected]> wrote:
> Sorry, I forgot to paste a few lines that may be useful. I have the lib
> copied properly into /usr/local/lib, and I have HBASE_LIBRARY_PATH set
> correctly. Do I need to restart HBase to run this test?
>
> hbase@node3:~/hbase-0.94.2$ export HBASE_LIBRARY_PATH=/usr/local/lib/
> hbase@node3:~/hbase-0.94.2$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
> 12/12/01 18:55:29 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32 not available.
> 12/12/01 18:55:29 INFO util.ChecksumType: Checksum can use java.util.zip.CRC32
> 12/12/01 18:55:29 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32C not available.
> 12/12/01 18:55:29 DEBUG util.FSUtils: Creating file:/tmp/test.txtwith permission:rwxrwxrwx
> 12/12/01 18:55:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 12/12/01 18:55:29 WARN metrics.SchemaConfigured: Could not determine table and column family of the HFile path /tmp/test.txt. Expecting at least 5 path components.
> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library is available
> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library not loaded
> Exception in thread "main" java.lang.RuntimeException: native snappy library not available
>         at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
>         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
>         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:112)
>         at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:264)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:739)
>         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:127)
>         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:118)
>         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:101)
>         at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
>         at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
>         at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
> hbase@node3:~/hbase-0.94.2$ ll /usr/local/lib/
> total 572
> -rw-r--r-- 1 root staff 391614 déc  1 18:33 libsnappy.a
> -rwxr-xr-x 1 root staff    957 déc  1 18:33 libsnappy.la
> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so -> libsnappy.so.1.1.3
> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so.1 -> libsnappy.so.1.1.3
> -rwxr-xr-x 1 root staff 178210 déc  1 18:33 libsnappy.so.1.1.3
> drwxrwsr-x 4 root staff   4096 jui 13 10:06 python2.6
> drwxrwsr-x 4 root staff   4096 jui 13 10:06 python2.7
> hbase@node3:~/hbase-0.94.2$
>
>
> 2012/12/1, Jean-Marc Spaggiari <[email protected]>:
>> Hi,
>>
>> I'm currently using GZip and want to move to Snappy.
>>
>> I downloaded the tar file, extracted and built it, and ran make install
>> and make check; everything worked fine.
>>
>> However, I'm not able to get this working:
>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
>> 12/12/01 18:46:21 WARN snappy.LoadSnappy: Snappy native library not loaded
>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available
>>         at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
>>
>> It sounds like HBase is not able to find the native library. How can I
>> tell HBase where the library is?
>>
>> Thanks,
>>
>> JM

--
Håvard Wahl Kongsgård
Faculty of Medicine &
Department of Mathematical Sciences
NTNU
http://havard.security-review.net/
