Thanks, Na.
The reason is that the server where I deploy Kylin didn't have the libsnappy.so* native library files.
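For anyone hitting the same UnsatisfiedLinkError: a quick way to confirm whether Hadoop's native Snappy support is present on a node is `hadoop checknative`. The helper below is a hypothetical sketch that just greps its output; paths are examples, not the thread's actual layout.

```shell
# Hypothetical helper: given `hadoop checknative` output on stdin,
# succeed only if the snappy line reports "true".
snappy_ok() {
  grep -E '^snappy:' | grep -q 'true'
}

# On the Kylin server (assumes a Hadoop client install is on PATH):
#   hadoop checknative -a | snappy_ok && echo "snappy OK" || echo "snappy MISSING"
#
# The native libraries themselves normally live under:
#   $HADOOP_HOME/lib/native/libsnappy.so*
```

If the check reports "false", installing the snappy native package (or copying libsnappy.so* into the native lib directory) on that node is the usual fix.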

2019-03-14 

lk_hadoop 



From: Na Zhai <[email protected]>
Sent: 2019-03-13 23:22
Subject: Re: kylin fail to Load HFile to HBase when use snappy
To: "[email protected]" <[email protected]>
Cc:

Hi, lk_hadoop
 
Which property did you use, 'kylin.hbase.default.compression.codec' or 
'kylin.storage.hbase.compression-codec'? Hope this helps: 
https://stackoverflow.com/questions/22150417/hadoop-mapreduce-java-lang-unsatisfiedlinkerror-org-apache-hadoop-util-nativec
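For reference, a sketch of the relevant kylin.properties lines (the first name is the one used by the Kylin 2.x configuration docs linked in the original message; the second is the older pre-2.x name, and which one applies depends on the Kylin version):

```properties
# kylin.properties -- HBase storage compression codec.
# Valid values include: none, snappy, lzo, gzip.
kylin.storage.hbase.compression-codec=snappy

# Older Kylin releases used this property name instead:
# kylin.hbase.default.compression.codec=snappy
```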
 
 
Sent from Mail for Windows 10
 



From: lk_hadoop <[email protected]>
Sent: Wednesday, March 13, 2019 5:51:48 PM
To: user
Subject: kylin fail to Load HFile to HBase when use snappy

Hi all,
   I'm using Kylin 2.5.0 on hbase-1.2.0-cdh5.14.0 with Snappy compression, 
following the doc: 
http://kylin.apache.org/cn/docs/install/configuration.html#compress-config
   I built a cube, and at step "#12 Step Name: Load HFile to HBase Table" I 
got this error:
   org.apache.kylin.engine.mr.exception.HadoopShellException: org.apache.hadoop.hbase.io.hfile.CorruptHFileException: Problem reading HFile Trailer from file hdfs://nameservice1/system/kylin/kylin_metadata/kylin-f8cd3f93-fa27-62af-d03a-15d8be13c929/st_sellgood_goods_detail_cube/hfile/F2/9f69ef6b124241679c876dd52465a900
        at org.apache.hadoop.hbase.io.hfile.HFile.openReader(HFile.java:503)
        at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:551)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplit(LoadIncrementalHFiles.java:681)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$3.call(LoadIncrementalHFiles.java:586)
        at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$3.call(LoadIncrementalHFiles.java:583)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
        at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
        at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
        at org.apache.hadoop.io.compress.SnappyCodec.getDecompressorType(SnappyCodec.java:195)
        at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:181)
        at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getDecompressor(Compression.java:328)
        at org.apache.hadoop.hbase.io.compress.Compression.decompress(Compression.java:423)
        at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultDecodingContext.prepareDecoding(HFileBlockDefaultDecodingContext.java:90)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock.unpack(HFileBlock.java:554)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader$1.nextBlock(HFileBlock.java:1395)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader$1.nextBlockWithBlockType(HFileBlock.java:1401)
        at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.<init>(HFileReaderV2.java:150)
        at org.apache.hadoop.hbase.io.hfile.HFile.openReader(HFile.java:491)
        ... 8 more

result code:2
        at org.apache.kylin.engine.mr.common.HadoopShellExecutable.doWork(HadoopShellExecutable.java:73)
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:163)
        at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:69)
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:163)
        at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:113)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

I have verified that HBase itself can create a table with Snappy compression.
Need your help!
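For completeness, the Snappy-on-HBase check mentioned above can be done with a throwaway table in the hbase shell (table and family names here are arbitrary examples):

```
create 'snappy_smoke_test', {NAME => 'f', COMPRESSION => 'SNAPPY'}
disable 'snappy_smoke_test'
drop 'snappy_smoke_test'
```

Note that this mainly proves the RegionServers can load Snappy; the failing Load HFile step in this thread runs in the Kylin server's own JVM, which needs the native library locally as well.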

2019-03-13


lk_hadoop 
