I would first ask whether you've installed the native snappy libraries on the
machine:

http://hbase.apache.org/book/snappy.compression.html

That seems to be the likely culprit here.
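
If the Snappy shared library itself isn't installed yet, on Arch it should be
available as an OS package (I believe the package is simply called "snappy",
but I'm not an Arch user, so take that name with a grain of salt). Once
libsnappy.so and the Hadoop native library are in place, you can check whether
HBase can load the codec before creating any tables; it's the same
CompressionTest class that shows up in your stack trace. Assuming the hbase
script is on your PATH, something along these lines:

  hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/testfile snappy

(the file path is just an example). If that test fails, pointing
HBASE_LIBRARY_PATH or LD_LIBRARY_PATH in hbase-env.sh at the directory that
actually contains libsnappy.so and libhadoop.so is usually what's missing.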

Thanks,

-Paul

On Sun, Jul 8, 2012 at 9:48 AM, Arvid Warnecke <[email protected]> wrote:

> Hello,
>
> I have already found some old mailing-list entries and Cloudera articles on
> how to use the Snappy library from Hadoop in HBase, but it does not seem to
> work for me.
>
> I installed Hadoop and HBase from the tarballs, because there are no
> packages available for Arch Linux. Everything worked fine, but I am not
> able to use any compression for my tables.
>
> When I use
>
> hbase> create 'table', {NAME=>'fam', COMPRESSION=>'snappy'}
>
> I see lots of the same error message in the regionserver logs:
> 2012-07-07 17:00:17,646 ERROR
> org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: Failed
> open of region=rawdb,,1341672997475.31ecf39289eb5034fb6a3c9f1a0cad2b.
> java.io.IOException: Compression algorithm 'snappy' previously failed
> test.
>         at
> org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:78)
>         at
> org.apache.hadoop.hbase.regionserver.HRegion.checkCompressionCodecs(HRegion.java:2797)
>         at
> org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:2786)
>         at
> org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:2774)
>         at
> org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:319)
>         at
> org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:105)
>         at
> org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:163)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown
> Source)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown
> Source)
>         at java.lang.Thread.run(Unknown Source)
>
> I already tried to use the following in the hbase-env.sh file:
>
> export
> HBASE_LIBRARY_PATH=/home/madhatter/CDH3/hadoop/lib/native/Linux-amd64-64
>
> That is where my Cloudera Hadoop & HBase are located, but it does not seem
> to do the trick. Do I need to set other variables as well? CLASSPATHs or
> anything like that? Compression seems to be the only thing that is not
> working. When I installed HBase from the Cloudera packages on Debian I never
> had such issues.
>
> Best regards,
> Arvid
>
> --
> [ Arvid Warnecke ][ arvid (at) nostalgix (dot) org ]
> [ IRC/OPN: "madhatter" ][ http://www.nostalgix.org ]
> ---[  ThreePiO was right: Let the Wookiee win.  ]---
>
