Hi all,
I am currently running the integration tests, but I hit the following error. Could you please share some suggestions on this?
*1. Command:*
mvn verify -fae -Dhdp.version=3.0.1.0-187 -P sandbox
*2. Error message from the YARN container attempt:*
2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Accepting Mapper Key with ordinal: 1
2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Do map, available memory: 322m
2019-03-18 16:43:25,596 INFO [main] org.apache.kylin.common.KylinConfig: Creating new manager instance of class org.apache.kylin.cube.cuboid.CuboidManager
2019-03-18 16:43:25,599 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
2019-03-18 16:43:25,599 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
2019-03-18 16:43:25,795 ERROR [main] org.apache.kylin.engine.mr.KylinMapper:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
    at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
    at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
    at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
    at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
    at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
    at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
    at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
    at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:85)
    at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:44)
    at org.apache.kylin.engine.mr.KylinMapper.map(KylinMapper.java:77)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
2019-03-18 16:43:25,797 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Do cleanup, available memory: 318m
2019-03-18 16:43:25,813 INFO [main] org.apache.kylin.engine.mr.KylinMapper: Total rows: 1
2019-03-18 16:43:25,813 ERROR [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    (stack trace identical to the one above)
2019-03-18 16:43:25,926 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics system...
2019-03-18 16:43:25,927 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system stopped.
2019-03-18 16:43:25,927 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system shutdown complete.
*3. What I have tried (did not work):*
I have made sure the following files contain the properties below:
3.1 core-site.xml
File: {kylin_root}/examples/test_case_data/sandbox/core-site.xml
File: HDP HDFS core-site.xml (via Ambari Web UI)
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
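(For completeness, here is how I checked what the cluster actually resolves, rather than what the XML files say. This is only a sketch; it assumes the HDP `hdfs` CLI is on the PATH of the node it runs on, and it degrades gracefully elsewhere.)

```shell
# Query the effective value of io.compression.codecs from the live config.
# Guarded so the snippet is safe on machines without the hdfs CLI.
KEY=io.compression.codecs
if command -v hdfs >/dev/null 2>&1; then
  hdfs getconf -confKey "$KEY"
else
  echo "hdfs CLI not found; run this on a cluster node"
fi
```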
3.2 mapred-site.xml
File: {kylin_root}/examples/test_case_data/sandbox/mapred-site.xml
File: HDP MapReduce2 mapred-site.xml (via Ambari Web UI)
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native</value>
</property>
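(Since `mapreduce.admin.user.env` only affects the container environment, I also sanity-checked the directory it points at from a shell. A minimal sketch, assuming the HDP 3.0.1.0-187 install path from above; adjust for other versions.)

```shell
# Verify the native dir referenced by mapreduce.admin.user.env exists
# and actually contains snappy libraries.
NATIVE_DIR=/usr/hdp/3.0.1.0-187/hadoop/lib/native
if [ -d "$NATIVE_DIR" ]; then
  ls "$NATIVE_DIR" | grep -i snappy || echo "no snappy libs in $NATIVE_DIR"
else
  echo "native dir missing: $NATIVE_DIR"
fi
```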
3.3 libsnappy.so
I have checked that libsnappy.so is present in /usr/hdp/3.0.1.0-187/hadoop/lib/native.
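(One more check I know of: even with libsnappy.so on disk, `buildSupportsSnappy()Z` can only link if the JVM loads libhadoop.so first, and `hadoop checknative -a` reports whether that works. Sketch below, guarded for machines without the hadoop CLI.)

```shell
# 'hadoop checknative -a' reports whether libhadoop was built with
# snappy support and whether the native libs load in this environment.
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a
  RAN_CHECK=yes
else
  echo "hadoop CLI not found; run this on a cluster node"
  RAN_CHECK=no
fi
```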
Thanks!
Best,
Yanwen Lin