Hi, 林琰文.


What’s your Hadoop version? Hadoop 2.x and 3.x may require different
configurations. I did some searching on this problem; hope this can help you:
http://www.aboutyun.com/thread-23759-1-1.html
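Also, before changing any configuration, it may help to confirm that the
native libraries are actually present and loadable. A minimal sketch
(check_snappy is just an illustrative helper, not a Hadoop tool; the HDP
path is the one mentioned later in this thread):

```shell
# Illustrative helper: report whether libsnappy is present in a native-lib directory.
check_snappy() {
  if ls "$1"/libsnappy.so* >/dev/null 2>&1; then
    echo "libsnappy found in $1"
  else
    echo "libsnappy NOT found in $1"
    return 1
  fi
}

# On an HDP 3.0.1.0-187 cluster you would run:
#   check_snappy /usr/hdp/3.0.1.0-187/hadoop/lib/native
#   hadoop checknative -a   # Hadoop's own report of loadable native codecs
```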





Sent from Mail<https://go.microsoft.com/fwlink/?LinkId=550986> for Windows 10



________________________________
From: 林琰文 <lyw1124278...@gmail.com>
Sent: Friday, March 22, 2019 10:47:07 AM
To: dev@kylin.apache.org
Subject: Re: Build buildSupportsSnappy Error When Doing Integration Testing

Hi Na Zhai,

Thank you for your reply!

Actually, I had checked that link before, but it did not work. When I
checked the log output from the YARN MapReduce job container, I found that
*LD_LIBRARY_PATH* was not set correctly, and the MapReduce log still showed
it could not load the native Hadoop library. However, I did specify
*LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native* in
*mapreduce.admin.user.env* in *mapred-site.xml*, so it was quite weird. My
hack to work around this problem was to add the option
*-Djava.library.path="${java.library.path}:/usr/hdp/${hdp.version}/hadoop/lib/native"*
to *mapreduce.admin.map.child.java.opts* in *mapred-site.xml*. So I don't
know why adding *LD_LIBRARY_PATH* to *mapreduce.admin.user.env* in
*mapred-site.xml* does not work. Do you have any idea on this?
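For reference, the workaround above looks like this in *mapred-site.xml* (a
sketch only; your cluster's existing *mapreduce.admin.map.child.java.opts*
value may carry other JVM flags that should be preserved):

```xml
<property>
  <name>mapreduce.admin.map.child.java.opts</name>
  <value>-Djava.library.path=${java.library.path}:/usr/hdp/${hdp.version}/hadoop/lib/native</value>
</property>
```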

Thanks again!

Best,
Yanwen Lin

On Thu, Mar 21, 2019 at 10:44 PM Na Zhai <na.z...@kyligence.io> wrote:

> Hi, 林琰文.
>
>
>
> You should add the native dependencies. Hope this can help you:
>
>
> https://stackoverflow.com/questions/22150417/hadoop-mapreduce-java-lang-unsatisfiedlinkerror-org-apache-hadoop-util-nativec
>
>
>
>
>
> Sent from Mail<https://go.microsoft.com/fwlink/?LinkId=550986> for Windows 10
>
>
>
> ________________________________
> From: 林琰文 <lyw1124278...@gmail.com>
> Sent: Tuesday, March 19, 2019 1:04:51 AM
> To: dev@kylin.apache.org
> Subject: Build buildSupportsSnappy Error When Doing Integration Testing
>
> Hi all,
> I am currently running the integration tests, but I hit the following
> error. Could you please share some suggestions?
> *1. Command*:
> mvn verify -fae -Dhdp.version=3.0.1.0-187 -P sandbox
> *2. Error message from Yarn Container Attempt:*
>
> 2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Accepting Mapper Key with ordinal: 1
>
> 2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Do map, available memory: 322m
>
> 2019-03-18 16:43:25,596 INFO [main] org.apache.kylin.common.KylinConfig: Creating new manager instance of class org.apache.kylin.cube.cuboid.CuboidManager
>
> 2019-03-18 16:43:25,599 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
>
> 2019-03-18 16:43:25,599 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
>
> 2019-03-18 16:43:25,795 ERROR [main] org.apache.kylin.engine.mr.KylinMapper:
> java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>  at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>  at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>  at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
>  at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
>  at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
>  at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
>  at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
>  at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
>  at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
>  at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
>  at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
>  at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
>  at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
>  at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
>  at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:85)
>  at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:44)
>  at org.apache.kylin.engine.mr.KylinMapper.map(KylinMapper.java:77)
>  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
>  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
>  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:422)
>  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
>
> 2019-03-18 16:43:25,797 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Do cleanup, available memory: 318m
>
> 2019-03-18 16:43:25,813 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Total rows: 1
>
> 2019-03-18 16:43:25,813 ERROR [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>  at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>  at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>  at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
>  at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
>  at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
>  at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
>  at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
>  at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
>  at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
>  at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
>  at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
>  at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
>  at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
>  at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
>  at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:85)
>  at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:44)
>  at org.apache.kylin.engine.mr.KylinMapper.map(KylinMapper.java:77)
>  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
>  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
>  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:422)
>  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
>
> 2019-03-18 16:43:25,926 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics system...
>
> 2019-03-18 16:43:25,927 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system stopped.
>
> 2019-03-18 16:43:25,927 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system shutdown complete.
> *3. What I have tried (but did not work):*
> I have made sure the following files contain the following properties:
> 3.1 core-site.xml
> File: {kylin_root}/examples/test_case_data/sandbox/core-site.xml
> File: HDP HDFS core-site.xml (via Ambari Web UI)
> <property>
>   <name>io.compression.codecs</name>
>   <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
> </property>
> 3.2 mapred-site.xml
> File: {kylin_root}/examples/test_case_data/sandbox/mapred-site.xml
> File: HDP MapReduce2 mapred-site.xml (via Ambari Web UI)
> <property>
>   <name>mapreduce.map.output.compress</name>
>   <value>true</value>
> </property>
> <property>
>   <name>mapred.map.output.compress.codec</name>
>   <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> </property>
> <property>
>   <name>mapreduce.admin.user.env</name>
>   <value>LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native</value>
> </property>
> 3.3 libsnappy.so
> I have checked that libsnappy.so is located at /usr/hdp/3.0.1.0-187/hadoop/lib/native.
>
> Thanks!
>
> Best,
> Yanwen Lin
>
