Have you configured this parameter, *kylin.engine.spark-conf.spark.yarn.archive*? It should point to an HDFS directory on your own cluster, not to hdfs://sandbox.hortonworks.com:8020 as shown in the documentation.
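A quick way to verify this setting (a sketch only; `$KYLIN_HOME`, the namenode host/port, and the HDFS path below are placeholders for your own environment, not values from your mail):

```shell
# Show the current setting (sketch; adjust paths for your own environment).
grep 'kylin.engine.spark-conf.spark.yarn.archive' "$KYLIN_HOME/conf/kylin.properties" \
  || echo "property not set"
# Expected form, pointing at YOUR cluster's namenode, e.g.:
# kylin.engine.spark-conf.spark.yarn.archive=hdfs://your-namenode:8020/kylin/spark/spark-libs.jar

# And confirm the archive actually exists at that HDFS path:
hadoop fs -ls /kylin/spark/spark-libs.jar || echo "archive not found on HDFS"
```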
On Wed, Aug 14, 2019 at 2:37 PM 邵志鹏 <bobr...@163.com> wrote:

> 2019-08-14T14:31:09,188 INFO [http-bio-7070-exec-7] org.apache.kylin.cube.CubeManager - Updating cube instance 'kylin_sales_cube_clone_spark'
> 2019-08-14T14:31:09,188 INFO [http-bio-7070-exec-7] org.apache.kylin.cube.CubeManager - Remove segment kylin_sales_cube_clone_spark[20120101000000_20140201235500]
>
> The "Delete Segment" action can remove the segment, and it can then be rebuilt.
>
> The main problem is:
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.api.java.function.Function
>
> I am not sure whether this is a jar conflict. All the jars under the spark directory, including spark-core, were packaged together and put on HDFS.
> Spark version: 2.3.2
> spark-core_2.11-2.3.2
>
> On 2019-08-14 14:08:03, 邵志鹏 <bobr...@163.com> wrote:
> > Hi Kylin team:
> >
> > kylin-2.6.3
> > I have successfully built the sample cube with MapReduce.
> > I then cloned kylin_sales_cube, changed its engine to Spark, and the build failed.
> >
> > Spark was downloaded with the bundled script, and the configuration follows http://kylin.apache.org/cn/docs/tutorial/cube_spark.html:
> >
> > $KYLIN_HOME/bin/download-spark.sh
> > jar cv0f spark-libs.jar -C $KYLIN_HOME/spark/jars/ .
> > hadoop fs -mkdir -p /kylin/spark/
> > hadoop fs -put spark-libs.jar /kylin/spark/
> > ...
> > Error log:
> >
> > Caused by: java.lang.NoClassDefFoundError: org/apache/spark/api/java/function/Function
> >     at org.apache.kylin.engine.spark.SparkBatchCubingJobBuilder2.<init>(SparkBatchCubingJobBuilder2.java:53) ~[kylin-engine-spark-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.engine.spark.SparkBatchCubingEngine2.createBatchCubingJob(SparkBatchCubingEngine2.java:44) ~[kylin-engine-spark-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.engine.EngineFactory.createBatchCubingJob(EngineFactory.java:60) ~[kylin-core-job-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.service.JobService.submitJobInternal(JobService.java:234) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.service.JobService.submitJob(JobService.java:202) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.controller.CubeController.buildInternal(CubeController.java:395) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     ... 77 more
> > Caused by: java.lang.ClassNotFoundException: org.apache.spark.api.java.function.Function
> >     at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1928) ~[catalina.jar:7.0.91]
> >     at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1771) ~[catalina.jar:7.0.91]
> >     at org.apache.kylin.engine.spark.SparkBatchCubingJobBuilder2.<init>(SparkBatchCubingJobBuilder2.java:53) ~[kylin-engine-spark-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.engine.spark.SparkBatchCubingEngine2.createBatchCubingJob(SparkBatchCubingEngine2.java:44) ~[kylin-engine-spark-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.engine.EngineFactory.createBatchCubingJob(EngineFactory.java:60) ~[kylin-core-job-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.service.JobService.submitJobInternal(JobService.java:234) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.service.JobService.submitJob(JobService.java:202) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.controller.CubeController.buildInternal(CubeController.java:395) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >
> > After this error, submitting the build again raises a new error:
> >
> > Caused by: org.apache.kylin.rest.exception.BadRequestException: The cube kylin_sales_cube_clone_spark has segments [kylin_sales_cube_clone_spark[20120101000000_20140201235500]], but none of them is READY. It's not allowed for parallel building
> >     at org.apache.kylin.rest.service.JobService.checkAllowParallelBuilding(JobService.java:422) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.service.JobService.submitJobInternal(JobService.java:221) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.service.JobService.submitJob(JobService.java:202) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     at org.apache.kylin.rest.controller.CubeController.buildInternal(CubeController.java:395) ~[kylin-server-base-2.6.3.jar:2.6.3]
> >     ... 77 more
> >
> > There is no new build job on the Monitor page.
> >
> > Thanks.
> >
> > By 邵志鹏
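One more thing worth ruling out: whether spark-core actually made it into the uploaded archive. A minimal sketch of that check (paths follow the `jar cv0f` / `hadoop fs -put` steps quoted above; adjust for your cluster):

```shell
# Pull the archive back from HDFS (sketch; path assumes the quoted upload steps).
hadoop fs -get /kylin/spark/spark-libs.jar /tmp/spark-libs.jar || true
# spark-libs.jar is a jar-of-jars; the spark-core jar should appear in the listing.
unzip -l /tmp/spark-libs.jar | grep 'spark-core_2.11' \
  || echo "spark-core jar missing from archive"
```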
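As for the leftover ERROR segment blocking resubmission: besides the "Delete Segment" action in the UI mentioned above, the segment can be dropped over Kylin's REST API. A hedged sketch (host, credentials, and the cube/segment names below are placeholders taken from the quoted log; please verify the endpoint against the Kylin 2.6.x REST API docs):

```shell
# Sketch: drop the stuck segment via REST so a new build can be submitted.
# Host, credentials, cube and segment names are placeholders -- adjust them.
KYLIN_HOST="localhost:7070"
CUBE="kylin_sales_cube_clone_spark"
SEGMENT="20120101000000_20140201235500"
curl -X DELETE --user ADMIN:KYLIN \
  "http://${KYLIN_HOST}/kylin/api/cubes/${CUBE}/segs/${SEGMENT}" \
  || echo "request failed (is the Kylin server reachable?)"
```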