Thanks for your help :)
To make sure, I manually set LD_LIBRARY_PATH, LIBRARY_PATH, and HBASE_LIBRARY_PATH:
bash-3.2$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
bash-3.2$ export LIBRARY_PATH=$LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
bash-3.2$ export HBASE_LIBRARY_PATH=/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
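Since hbase launches a JVM, I also tried pointing java.library.path at the same directory via HBASE_OPTS (just a sketch; whether this install actually picks up HBASE_OPTS, e.g. from conf/hbase-env.sh, is my assumption):

```shell
# Assumption: this HBase install reads HBASE_OPTS (e.g. from conf/hbase-env.sh).
# Point the JVM's java.library.path at the native libs directly, since
# LD_LIBRARY_PATH in the launching shell may not reach the child JVM.
NATIVE_DIR=/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
export HBASE_OPTS="${HBASE_OPTS:-} -Djava.library.path=$NATIVE_DIR"
echo "$HBASE_OPTS"
```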
But running the compression test failed with "native snappy library not available":
bash-3.2$ ./hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy
log4j:WARN No appenders could be found for logger (org.apache.hadoop.conf.Configuration).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.RuntimeException: native snappy library not available
        at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:121)
        at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:104)
        at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:118)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:236)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:588)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:178)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:150)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:140)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:104)
        at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
        at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:137)
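Since the snappy bindings are loaded through libhadoop, I also checked (a sketch; the path is the same one from my exports above) whether libhadoop itself resolves cleanly:

```shell
# If libhadoop can't resolve one of its own dependencies, the Java-side
# loader fails the same way; "not found" lines in ldd output are the clue.
NATIVE_DIR=/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
out=$(ldd "$NATIVE_DIR/libhadoop.so.1.0.0" 2>&1) || out="could not inspect: $out"
printf '%s\n' "$out"
```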
I verified that libsnappy is indeed installed:
bash-3.2$ ls -al $HBASE_LIBRARY_PATH
total 1412
drwxr-xr-x 2 1106 592 4096 Feb 11 01:06 .
drwxr-xr-x 3 1106 592 4096 Feb 11 01:06 ..
-rw-r--r-- 1 1106 592 616862 Feb 11 01:06 libhadoop.a
-rwxr-xr-x 1 1106 592 1051 Feb 11 01:06 libhadoop.la
lrwxrwxrwx 1 1106 592 18 Feb 27 18:12 libhadoop.so -> libhadoop.so.1.0.0
lrwxrwxrwx 1 1106 592 18 Feb 27 18:12 libhadoop.so.1 -> libhadoop.so.1.0.0
-rwxr-xr-x 1 1106 592 340361 Feb 11 01:06 libhadoop.so.1.0.0
-rw-r--r-- 1 1106 592 184418 Feb 11 01:06 libhdfs.a
-rwxr-xr-x 1 1106 592 1034 Feb 11 01:06 libhdfs.la
lrwxrwxrwx 1 1106 592 16 Feb 27 18:12 libhdfs.so -> libhdfs.so.0.0.0
lrwxrwxrwx 1 1106 592 16 Feb 27 18:12 libhdfs.so.0 -> libhdfs.so.0.0.0
-rwxr-xr-x 1 1106 592 125455 Feb 11 01:06 libhdfs.so.0.0.0
-rw-r--r-- 1 1106 592 37392 Feb 11 01:06 libsnappy.a
lrwxrwxrwx 1 1106 592 18 Feb 27 18:12 libsnappy.so -> libsnappy.so.1.1.1
lrwxrwxrwx 1 1106 592 18 Feb 27 18:12 libsnappy.so.1 -> libsnappy.so.1.1.1
-rw-r--r-- 1 1106 592 26824 Feb 11 01:06 libsnappy.so.1.1.1
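One more thing I looked at (a sketch; assumes `file` and `java` are on the PATH): whether the library's bitness matches the JVM's, since a mismatch produces exactly this symptom without any louder error.

```shell
# A 32-bit .so under a 64-bit JVM (or the reverse) is skipped silently,
# so compare the ELF class of the library against the JVM's reported bitness.
NATIVE_DIR=/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
lib_info=$(file "$NATIVE_DIR/libsnappy.so.1.1.1" 2>&1 || echo "no such library here")
jvm_info=$(java -version 2>&1 | tail -1 || echo "no java on PATH")
printf 'lib: %s\njvm: %s\n' "$lib_info" "$jvm_info"
```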
Just for grins and giggles, I re-ran this as root. In addition to the exception mentioned above, I also got the following warning:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
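To get more detail out of that loader (and silence the log4j warning at the same time), one option might be a minimal log4j config with NativeCodeLoader turned up to DEBUG. A sketch using standard log4j 1.2 property syntax; the /tmp filename is just an example, the real file would go in HBase's conf directory:

```shell
# Write a minimal log4j 1.2 config that enables DEBUG for the native
# loader, so it reports *why* it fell back to the builtin Java classes.
cat > /tmp/log4j.properties.sample <<'EOF'
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.SimpleLayout
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
EOF
cat /tmp/log4j.properties.sample
```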
Any ideas?
On Tue, 28 Feb 2012 20:02:38 -0500, Stack <[email protected]> wrote:
On Tue, Feb 28, 2012 at 1:52 PM, Peter Naudus <[email protected]> wrote:
What else can I do to fix / diagnose this problem?
Does our little compression tool help?
http://hbase.apache.org/book.html#compression.test
St.Ack