This command will give you the exact name:

java org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"

Can you try to run it?

But it's most probably Linux-amd64-64
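For reference, PlatformName needs the Hadoop classes on the classpath (e.g. `export CLASSPATH=$(bin/hbase classpath)` from the HBase home, as shown further down this thread). The trailing sed is easy to misread; here is a standalone sketch of what it does (the sample strings are illustrative, not output from a real run):

```shell
# PlatformName prints "<os.name>-<os.arch>-<sun.arch.data.model>",
# e.g. "Linux-amd64-64". The sed only replaces spaces, which matters
# when os.name itself contains spaces (e.g. "Mac OS X"):
echo "Mac OS X-x86_64-64" | sed -e "s/ /_/g"   # -> Mac_OS_X-x86_64-64
echo "Linux-amd64-64"     | sed -e "s/ /_/g"   # unchanged
```

So on a typical Linux x86_64 box the sed is a no-op and the directory name is simply Linux-amd64-64.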



2014-08-26 20:24 GMT-04:00 [email protected] <
[email protected]>:

> Hi,
>
> Thanks!
>
> A question:
> If I run:
> $  uname -m
> x86_64
>
> Should I use "lib/native/Linux-amd64-64" or "lib/native/x86_64" in
> $HADOOP_HOME and $HBASE_HOME?
>
> Arthur
>
>
> On 27 Aug, 2014, at 8:10 am, Jean-Marc Spaggiari <[email protected]>
> wrote:
>
> > Ok.
> >
> > This is the way the lib path is built:
> >
> > JAVA_LIBRARY_PATH=$(append_path "$JAVA_LIBRARY_PATH" ${HBASE_HOME}/build/native/${JAVA_PLATFORM}/lib)
> >
> > And JAVA_PLATFORM comes from:
> > JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA} org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`
> >
> > You can double-check it by doing:
> >
> > # Adjust to your JAVA_HOME...
> > export JAVA_HOME=/usr/local/jdk1.7.0_45/
> >
> > export CLASSPATH=`bin/hbase classpath`
> > $JAVA_HOME/bin/java org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"
> >
> > Result for me is this: Linux-amd64-64. Might be different for you.
> >
> > Then you link the libs the way Alex said before:
> > cd lib/native/Linux-amd64-64
> > ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so .
> > ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1 .
> >
> > AND.....
> >
> > The hadoop .so too! And I think this is what's missing for you:
> > ln -s /YOURHADOOPPATH/libhadoop.so .
> >
> > Your folder should look like this:
> > jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ tree
> > .
> > └── Linux-amd64-64
> >    ├── libhadoop.so
> >    ├── libsnappy.so -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so
> >    └── libsnappy.so.1 -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1
> >
> > I copied libhadoop.so instead of doing a link because it was not available on this computer.
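The linking steps above can be sketched end to end. Everything below uses throwaway placeholder paths under /tmp; your real HBase home, snappy .libs directory, and Hadoop native directory will differ:

```shell
# Placeholder layout under /tmp -- substitute your real paths.
demo=/tmp/hbase-native-demo
rm -rf "$demo"
mkdir -p "$demo/hbase/lib/native/Linux-amd64-64" "$demo/snappy/.libs" "$demo/hadoop"

# Stand-ins for the real libraries (already built in practice):
touch "$demo/snappy/.libs/libsnappy.so" "$demo/snappy/.libs/libsnappy.so.1" "$demo/hadoop/libhadoop.so"

# The linking recipe from the mail: symlink snappy AND libhadoop.so
# into the platform directory under HBase's lib/native.
cd "$demo/hbase/lib/native/Linux-amd64-64"
ln -s "$demo/snappy/.libs/libsnappy.so" .
ln -s "$demo/snappy/.libs/libsnappy.so.1" .
ln -s "$demo/hadoop/libhadoop.so" .

ls -l
```

The point of the sketch is the shape of the result: all three libraries must resolve from lib/native/&lt;platform&gt;, matching the tree output below.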
> >
> > Then test it:
> > jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> > 2014-08-26 20:06:43,987 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > 2014-08-26 20:06:44,831 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-26 20:06:44,832 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> > 2014-08-26 20:06:45,125 INFO  [main] compress.CodecPool: Got brand-new compressor [.snappy]
> > 2014-08-26 20:06:45,131 INFO  [main] compress.CodecPool: Got brand-new compressor [.snappy]
> > 2014-08-26 20:06:45,254 INFO  [main] compress.CodecPool: Got brand-new decompressor [.snappy]
> > SUCCESS
> >
> >
> > Please let us know if it still doesn't work for you. Without libhadoop.so
> > it doesn't work for me...
> > jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ rm Linux-amd64-64/libhadoop.so
> >
> > jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> > 2014-08-26 20:09:28,945 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > 2014-08-26 20:09:29,460 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> > 2014-08-26 20:09:29,775 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-26 20:09:29,776 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> > Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
> >    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
> > ...
> >
> >
> > I did all of that using a freshly extracted hbase-0.98.5-hadoop2-bin.tar.gz.
> >
> > JM
> >
> >
> > 2014-08-26 19:47 GMT-04:00 [email protected] <
> > [email protected]>:
> >
> >> $ uname -m
> >> x86_64
> >>
> >> Arthur
> >>
> >> On 27 Aug, 2014, at 7:45 am, Jean-Marc Spaggiari <
> [email protected]>
> >> wrote:
> >>
> >>> Hi Arthur,
> >>>
> >>> What does uname -m give you? You need to check that to create the
> >>> right folder under the lib directory.
> >>>
> >>> JM
> >>>
> >>>
> >>> 2014-08-26 19:43 GMT-04:00 Alex Kamil <[email protected]>:
> >>>
> >>>> Something like this worked for me:
> >>>> 1. get hbase binaries
> >>>> 2. sudo yum install snappy snappy-devel
> >>>> 3. ln -sf /usr/lib64/libsnappy.so /var/lib/hadoop/lib/native/Linux-amd64-64/.
> >>>> 4. ln -sf /usr/lib64/libsnappy.so /var/lib/hbase/lib/native/Linux-amd64-64/.
> >>>> 5. add snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
> >>>> ref: https://issues.apache.org/jira/browse/PHOENIX-877
> >>>>
> >>>>
> >>>> On Tue, Aug 26, 2014 at 7:25 PM, [email protected] <
> >>>> [email protected]> wrote:
> >>>>
> >>>>> Hi,
> >>>>>
> >>>>> I just tried three more steps but was not able to get through.
> >>>>>
> >>>>>
> >>>>> 1) copied snappy files to $HBASE_HOME/lib
> >>>>> $ cd $HBASE_HOME
> >>>>> $ ll lib/*sna*
> >>>>> -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54 lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> >>>>> -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43 lib/snappy-java-1.0.4.1.jar
> >>>>>
> >>>>> ll lib/native/
> >>>>> drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
> >>>>>
> >>>>> ll lib/native/Linux-amd64-64/
> >>>>> total 18964
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> >>>>> -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
> >>>>> -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
> >>>>> -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
> >>>>> -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so -> libsnappy.so.1.2.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 -> libsnappy.so.1.2.0
> >>>>> -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> >>>>> drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
> >>>>>
> >>>>> 2) added the following to $HBASE_HOME/conf/hbase-env.sh:
> >>>>>
> >>>>> ###
> >>>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> >>>>> export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> >>>>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> >>>>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> >>>>> ###
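One caveat on the exports above (an editorial observation, not something stated in the thread): the dynamic linker treats LD_LIBRARY_PATH entries as directories to search, so appending the hadoop-snappy jar to HBASE_LIBRARY_PATH most likely has no effect there; jars belong on the classpath. The append pattern itself is just colon-joining, sketched with a placeholder path:

```shell
# These variables are plain colon-separated directory lists,
# appended one entry at a time. /opt/hadoop is a placeholder;
# use your real native-lib directory.
DEMO_PATH=/usr/local/lib
DEMO_PATH=$DEMO_PATH:/opt/hadoop/lib/native/Linux-amd64-64
echo "$DEMO_PATH"
```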
> >>>>>
> >>>>> 3) restarted HBase and tried again
> >>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> >>>>> 2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> >>>>> 2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> >>>>> 2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> >>>>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> >>>>>       at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >>>>>       at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >>>>>       at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >>>>>       at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >>>>>       at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >>>>>       at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >>>>>       at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >>>>>       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >>>>>       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >>>>>       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >>>>>       at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >>>>>       at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >>>>>       at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >>>>>
> >>>>>
> >>>>> Regards
> >>>>> Arthur
> >>>>>
> >>>>>
> >>>>>
> >>>>> On 27 Aug, 2014, at 6:27 am, [email protected] <
> >>>>> [email protected]> wrote:
> >>>>>
> >>>>>> Hi Sean,
> >>>>>>
> >>>>>> Thanks for your reply.
> >>>>>>
> >>>>>> I tried the following tests
> >>>>>>
> >>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
> >>>>>> 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> >>>>>> 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> >>>>>> 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> >>>>>> 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
> >>>>>> 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> >>>>>> 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> >>>>>> 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
> >>>>>> SUCCESS
> >>>>>>
> >>>>>>
> >>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> >>>>>> 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> >>>>>> 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> >>>>>> 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> >>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> >>>>>>     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >>>>>>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >>>>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >>>>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >>>>>>     at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >>>>>>     at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >>>>>>     at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >>>>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >>>>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >>>>>>     at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >>>>>>     at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >>>>>>     at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >>>>>>     at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >>>>>>
> >>>>>>
> >>>>>> $ hbase shell
> >>>>>> 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>>>> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> >>>>>> Type "exit<RETURN>" to leave the HBase Shell
> >>>>>> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
> >>>>>>
> >>>>>> hbase(main):001:0>
> >>>>>> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION => 'snappy'}
> >>>>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> >>>>>>
> >>>>>> ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
> >>>>>>     at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
> >>>>>>     at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
> >>>>>>     at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
> >>>>>>     at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
> >>>>>>     at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
> >>>>>>     at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
> >>>>>>     at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
> >>>>>>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
> >>>>>>     at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> >>>>>>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> >>>>>>     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> >>>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> >>>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> >>>>>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> >>>>>>     at java.lang.Thread.run(Thread.java:662)
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>> Regards
> >>>>>> Arthur
> >>>>>>
> >>>>>>
> >>>>>> On 26 Aug, 2014, at 11:02 pm, Sean Busbey <[email protected]>
> >> wrote:
> >>>>>>
> >>>>>>> Hi Arthur!
> >>>>>>>
> >>>>>>> Our Snappy build instructions are currently out of date and I'm
> >>>> working
> >>>>> on updating them[1]. In short, I don't think there are any special
> >> build
> >>>>> steps for using snappy.
> >>>>>>>
> >>>>>>> I'm still working out what needs to be included in our instructions
> >>>> for
> >>>>> local and cluster testing.
> >>>>>>>
> >>>>>>> If you use the test for compression options, locally things will fail
> >>>>>>> because the native hadoop libs won't be present:
> >>>>>>>
> >>>>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> >>>>>>> (for comparison, replace "snappy" with "gz" and you will get a warning
> >>>>>>> about not having native libraries, but the test will succeed.)
> >>>>>>>
> >>>>>>> I believe JM's suggestion is for you to copy the Hadoop native
> >>>>>>> libraries into the local HBase lib/native directory, which would allow
> >>>>>>> the local test to pass. If you are running in a deployed Hadoop
> >>>>>>> cluster, I would expect the necessary libraries to already be
> >>>>>>> available to HBase.
> >>>>>>>
> >>>>>>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
> >>>>>>>
> >>>>>>> -Sean
> >>>>>>>
> >>>>>>>
> >>>>>>> On Tue, Aug 26, 2014 at 8:30 AM, [email protected] <
> >>>>> [email protected]> wrote:
> >>>>>>> Hi JM
> >>>>>>>
> >>>>>>> Below are my commands; I tried two cases under the same source code
> >>>>>>> folder:
> >>>>>>> a) compile with snappy parameters(failed),
> >>>>>>> b) compile without snappy parameters (successful).
> >>>>>>>
> >>>>>>> Regards
> >>>>>>> Arthur
> >>>>>>>
> >>>>>>> wget http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
> >>>>>>> tar -vxf hbase-0.98.4-src.tar.gz
> >>>>>>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
> >>>>>>> cd  hbase-0.98.4-src_snappy
> >>>>>>> nano dev-support/generate-hadoopX-poms.sh
> >>>>>>> (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
> >>>>>>>
> >>>>>>>
> >>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> >>>>>>>
> >>>>>>> a) with snappy parameters:
> >>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [INFO] Reactor Summary:
> >>>>>>> [INFO]
> >>>>>>> [INFO] HBase ............................................. SUCCESS [8.192s]
> >>>>>>> [INFO] HBase - Common .................................... SUCCESS [5.638s]
> >>>>>>> [INFO] HBase - Protocol .................................. SUCCESS [1.535s]
> >>>>>>> [INFO] HBase - Client .................................... SUCCESS [1.206s]
> >>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
> >>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
> >>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
> >>>>>>> [INFO] HBase - Server .................................... FAILURE [0.234s]
> >>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
> >>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
> >>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
> >>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
> >>>>>>> [INFO] HBase - Examples .................................. SKIPPED
> >>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [INFO] BUILD FAILURE
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [INFO] Total time: 19.474s
> >>>>>>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
> >>>>>>> [INFO] Final Memory: 51M/1100M
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>>>>> [ERROR]
> >>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> >>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >>>>>>> [ERROR]
> >>>>>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> >>>>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>>>>> [ERROR]
> >>>>>>> [ERROR] After correcting the problems, you can resume the build with the command
> >>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>> b) try again, without snappy parameters
> >>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
> >>>>>>> [INFO] Building tar: /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [INFO] Reactor Summary:
> >>>>>>> [INFO]
> >>>>>>> [INFO] HBase ............................................. SUCCESS [3.290s]
> >>>>>>> [INFO] HBase - Common .................................... SUCCESS [3.119s]
> >>>>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.972s]
> >>>>>>> [INFO] HBase - Client .................................... SUCCESS [0.920s]
> >>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
> >>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
> >>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
> >>>>>>> [INFO] HBase - Server .................................... SUCCESS [4.790s]
> >>>>>>> [INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
> >>>>>>> [INFO] HBase - Thrift .................................... SUCCESS [1.536s]
> >>>>>>> [INFO] HBase - Shell ..................................... SUCCESS [0.369s]
> >>>>>>> [INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
> >>>>>>> [INFO] HBase - Examples .................................. SUCCESS [0.459s]
> >>>>>>> [INFO] HBase - Assembly .................................. SUCCESS [13.240s]
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [INFO] BUILD SUCCESS
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>> [INFO] Total time: 31.408s
> >>>>>>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
> >>>>>>> [INFO] Final Memory: 57M/1627M
> >>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <
> >>>>> [email protected]> wrote:
> >>>>>>>
> >>>>>>>> Hi Arthur,
> >>>>>>>>
> >>>>>>>> How have you extracted the HBase source, and what command do you run
> >>>>>>>> to build? I will do the same here locally so I can provide you the
> >>>>>>>> exact steps to complete.
> >>>>>>>>
> >>>>>>>> JM
> >>>>>>>>
> >>>>>>>>
> >>>>>>>> 2014-08-26 8:42 GMT-04:00 [email protected] <
> >>>>> [email protected]
> >>>>>>>>> :
> >>>>>>>>
> >>>>>>>>> Hi JM
> >>>>>>>>>
> >>>>>>>>> I am not too sure what you mean. Do you mean I should create a new
> >>>>>>>>> folder in my HBASE_SRC named lib/native/Linux-x86, copy these files
> >>>>>>>>> to this folder, and then try to compile it again?
> >>>>>>>>>
> >>>>>>>>> Regards
> >>>>>>>>> Arthur
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
> >>>>> [email protected]>
> >>>>>>>>> wrote:
> >>>>>>>>>
> >>>>>>>>>> Hi Arthur,
> >>>>>>>>>>
> >>>>>>>>>> Almost done! You now need to copy them into the HBase folder.
> >>>>>>>>>>
> >>>>>>>>>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep -v .rb
> >>>>>>>>>> .
> >>>>>>>>>> ├── native
> >>>>>>>>>> │   └── Linux-x86
> >>>>>>>>>> │       ├── libsnappy.a
> >>>>>>>>>> │       ├── libsnappy.la
> >>>>>>>>>> │       ├── libsnappy.so
> >>>>>>>>>> │       ├── libsnappy.so.1
> >>>>>>>>>> │       └── libsnappy.so.1.2.0
> >>>>>>>>>>
> >>>>>>>>>> I don't have any hadoop-snappy lib in my hbase folder and it works
> >>>>>>>>>> very well with Snappy for me...
> >>>>>>>>>>
> >>>>>>>>>> JM
> >>>>>>>>>>
> >>>>>>>>>> 2014-08-26 8:09 GMT-04:00 [email protected] <
> >>>>>>>>> [email protected]
> >>>>>>>>>>> :
> >>>>>>>>>>
> >>>>>>>>>>> Hi JM,
> >>>>>>>>>>>
> >>>>>>>>>>> Below are my steps to install the snappy lib. Did I miss something?
> >>>>>>>>>>>
> >>>>>>>>>>> Regards
> >>>>>>>>>>> Arthur
> >>>>>>>>>>>
> >>>>>>>>>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> >>>>>>>>>>> tar -vxf snappy-1.1.1.tar.gz
> >>>>>>>>>>> cd snappy-1.1.1
> >>>>>>>>>>> ./configure
> >>>>>>>>>>> make
> >>>>>>>>>>> make install
> >>>>>>>>>>>     make[1]: Entering directory `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>>>>>>>>     test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
> >>>>>>>>>>>      /bin/sh ./libtool   --mode=install /usr/bin/install -c
> >>>>>>>>>>> libsnappy.la '/usr/local/lib'
> >>>>>>>>>>>     libtool: install: /usr/bin/install -c
> >>>>> .libs/libsnappy.so.1.2.0
> >>>>>>>>>>> /usr/local/lib/libsnappy.so.1.2.0
> >>>>>>>>>>>     libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 &&
> ln
> >>>>> -s
> >>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
> >>>>>>>>>>>     libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln
> -s
> >>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so; }; })
> >>>>>>>>>>>     libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> >>>>>>>>>>> /usr/local/lib/libsnappy.la
> >>>>>>>>>>>     libtool: install: /usr/bin/install -c .libs/libsnappy.a
> >>>>>>>>>>> /usr/local/lib/libsnappy.a
> >>>>>>>>>>>     libtool: install: chmod 644 /usr/local/lib/libsnappy.a
> >>>>>>>>>>>     libtool: install: ranlib /usr/local/lib/libsnappy.a
> >>>>>>>>>>>     libtool: finish: PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin" ldconfig -n /usr/local/lib
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>
> ----------------------------------------------------------------------
> >>>>>>>>>>>     Libraries have been installed in:
> >>>>>>>>>>>     /usr/local/lib
> >>>>>>>>>>>     If you ever happen to want to link against installed
> >>>>> libraries
> >>>>>>>>>>>     in a given directory, LIBDIR, you must either use libtool,
> >>>>> and
> >>>>>>>>>>>     specify the full pathname of the library, or use the
> >>>>> `-LLIBDIR'
> >>>>>>>>>>>     flag during linking and do at least one of the following:
> >>>>>>>>>>>     - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
> >>>>>>>>>>>     during execution
> >>>>>>>>>>>     - add LIBDIR to the `LD_RUN_PATH' environment variable
> >>>>>>>>>>>     during linking
> >>>>>>>>>>>     - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
> >>>>>>>>>>>     - have your system administrator add LIBDIR to
> >>>>> `/etc/ld.so.conf'
> >>>>>>>>>>>     See any operating system documentation about shared
> >>>>> libraries for
> >>>>>>>>>>>     more information, such as the ld(1) and ld.so(8) manual
> >>>>> pages.
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>
> ----------------------------------------------------------------------
> >>>>>>>>>>>     test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> >>>>>>>>>>> "/usr/local/share/doc/snappy"
> >>>>>>>>>>>      /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
> >>>>> README
> >>>>>>>>>>> format_description.txt framing_format.txt
> >>>>> '/usr/local/share/doc/snappy'
> >>>>>>>>>>>     test -z "/usr/local/include" || /bin/mkdir -p
> >>>>>>>>> "/usr/local/include"
> >>>>>>>>>>>      /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> >>>>>>>>>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
> >>>>>>>>>>>     make[1]: Leaving directory
> >>>>> `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>>>>>>>>
> >>>>>>>>>>> ll /usr/local/lib
> >>>>>>>>>>>     -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
> >>>>>>>>>>>     -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
> >>>>>>>>>>>     lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so -> libsnappy.so.1.2.0
> >>>>>>>>>>>     lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 -> libsnappy.so.1.2.0
> >>>>>>>>>>>     -rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
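The listing above shows snappy installed under /usr/local/lib. A sketch of linking those libs into HBase's native directory, as suggested elsewhere in the thread; both paths are assumptions to adjust for your HBase install and platform string:

```shell
# Sketch: symlink the installed snappy libs into HBase's native lib dir.
# Both paths below are assumptions -- adjust to your layout and platform.
HBASE_NATIVE="${HBASE_NATIVE:-$HOME/hbase-0.98.4-hadoop2/lib/native/Linux-amd64-64}"
SNAPPY_LIB_DIR="${SNAPPY_LIB_DIR:-/usr/local/lib}"

mkdir -p "$HBASE_NATIVE"
ln -sf "$SNAPPY_LIB_DIR/libsnappy.so"   "$HBASE_NATIVE/libsnappy.so"
ln -sf "$SNAPPY_LIB_DIR/libsnappy.so.1" "$HBASE_NATIVE/libsnappy.so.1"
ls -l "$HBASE_NATIVE"
```

Symlinks (rather than copies) mean a later snappy upgrade under /usr/local/lib is picked up without touching the HBase tree.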
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>>>>>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <[email protected]> wrote:
> >>>>>>>>>>>
> >>>>>>>>>>>> Hi Arthur,
> >>>>>>>>>>>>
> >>>>>>>>>>>> Do you have the Snappy libs installed and configured? HBase doesn't
> >>>>>>>>>>>> come with Snappy, so you need to install it first.
> >>>>>>>>>>>>
> >>>>>>>>>>>> Shameless plug:
> >>>>>>>>>>>>
> >>>>>>>>>>>> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> >>>>>>>>>>>>
> >>>>>>>>>>>> This is for 0.96 but should be very similar for 0.98. I will try it
> >>>>>>>>>>>> soon and post an update, but keep us posted here so we can support you...
> >>>>>>>>>>>>
> >>>>>>>>>>>> JM
> >>>>>>>>>>>>
> >>>>>>>>>>>>
> >>>>>>>>>>>> 2014-08-26 7:34 GMT-04:00 [email protected] <[email protected]>:
> >>>>>>>>>>>>
> >>>>>>>>>>>>> Hi,
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> I need to install Snappy for HBase 0.98.4 (my Hadoop version is 2.4.1).
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Can you please advise what could be wrong? Could my pom.xml be
> >>>>>>>>>>>>> incorrect or missing something?
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Regards
> >>>>>>>>>>>>> Arthur
> >>>>>>>>>>>>>
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Below are my commands:
> >>>>>>>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> >>>>>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy
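One sanity check worth running before chasing the hadoop-snappy artifact (an aside, not from the thread): Hadoop 2.x ships its snappy glue inside libhadoop.so, and recent Hadoop 2 releases include a native-codec checker. A hedged sketch, guarded so it degrades when `hadoop` isn't on the PATH:

```shell
# Report what the Hadoop native layer already provides (snappy included).
# "hadoop checknative -a" exists in recent Hadoop 2 releases; the guard
# below just avoids an error on machines where hadoop isn't installed.
if command -v hadoop >/dev/null 2>&1; then
    hadoop checknative -a
else
    echo "hadoop not on PATH -- run this on the cluster node"
fi
```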
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Log:
> >>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>>>>>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
> >>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>>>>>>>> [INFO] Reactor Summary:
> >>>>>>>>>>>>> [INFO]
> >>>>>>>>>>>>> [INFO] HBase ............................................. SUCCESS [3.129s]
> >>>>>>>>>>>>> [INFO] HBase - Common .................................... SUCCESS [3.105s]
> >>>>>>>>>>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.976s]
> >>>>>>>>>>>>> [INFO] HBase - Client .................................... SUCCESS [0.925s]
> >>>>>>>>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.183s]
> >>>>>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.497s]
> >>>>>>>>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.407s]
> >>>>>>>>>>>>> [INFO] HBase - Server .................................... FAILURE [0.103s]
> >>>>>>>>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Examples .................................. SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
> >>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>>>>>>>> [INFO] BUILD FAILURE
> >>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>>>>>>>> [INFO] Total time: 9.939s
> >>>>>>>>>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> >>>>>>>>>>>>> [INFO] Final Memory: 61M/2921M
> >>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
> >>>>>>>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>>>>>>>>>>> [ERROR]
> >>>>>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> >>>>>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >>>>>>>>>>>>> [ERROR]
> >>>>>>>>>>>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> >>>>>>>>>>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>>>>>>>>>>> [ERROR]
> >>>>>>>>>>>>> [ERROR] After correcting the problems, you can resume the build with the command
> >>>>>>>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>>>>>>>>>>
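The key phrase in the error above is "was cached in the local repository, resolution will not be reattempted": Maven remembers the failed lookup of hadoop-snappy. A sketch of clearing that cached entry and forcing a re-check with the standard `-U` flag (the repository path assumes the default ~/.m2 location):

```shell
# Drop the cached failed-resolution marker for hadoop-snappy from the local
# repository (default ~/.m2 location; adjust if yours is relocated).
rm -rf "$HOME/.m2/repository/org/apache/hadoop/hadoop-snappy"

# Then re-run the build with -U to force an update check. Note this only
# succeeds once the hadoop-snappy artifact exists somewhere Maven can see,
# e.g. after building and "mvn install"-ing it locally.
# mvn -f pom.xml.hadoop2 install -DskipTests assembly:single \
#     -Prelease,hadoop-snappy -U
```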
> >>>>>>> --
> >>>>>>> Sean
