Hi all, has anyone tried enabling the dashboard in the newest Kylin? I have been trying this new feature. By default the system cubes use the MapReduce engine, and both build and refresh jobs run without any problem. But when I switch the build engine to Spark, the build always fails.
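For reference, my Spark engine settings in kylin.properties should be essentially the ones shipped with Kylin 2.3.0 plus the HDP-specific -Dhdp.version options; the list below is reconstructed from the --conf flags in the spark-submit command Kylin generated (shown at the end of the log), with host names masked, so please treat the exact file contents as approximate:

kylin.engine.spark-conf.spark.master=yarn
kylin.engine.spark-conf.spark.submit.deployMode=cluster
kylin.engine.spark-conf.spark.yarn.queue=default
kylin.engine.spark-conf.spark.executor.instances=1
kylin.engine.spark-conf.spark.executor.memory=1G
kylin.engine.spark-conf.spark.executor.cores=2
kylin.engine.spark-conf.spark.eventLog.enabled=true
kylin.engine.spark-conf.spark.eventLog.dir=hdfs:///kylin/spark-history
kylin.engine.spark-conf.spark.history.fs.logDirectory=hdfs:///kylin/spark-history
kylin.engine.spark-conf.spark.hadoop.yarn.timeline-service.enabled=true
kylin.engine.spark-conf.spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec
kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current
# spark.yarn.archive is not set, which matches the SPARK_HOME upload warning in the log below

The error log from kylin.log is below: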
OS command error exit with return code: 1, error message:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/03/28 18:03:30 INFO Client: Requesting a new application from cluster with 3 NodeManagers
18/03/28 18:03:30 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
18/03/28 18:03:30 INFO Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
18/03/28 18:03:30 INFO Client: Setting up container launch context for our AM
18/03/28 18:03:30 INFO Client: Setting up the launch environment for our AM container
18/03/28 18:03:30 INFO Client: Preparing resources for our AM container
18/03/28 18:03:32 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
18/03/28 18:03:34 INFO Client: Uploading resource file:/tmp/spark-e9a0636e-6d6e-476a-ad62-dfcb1e4d6102/__spark_libs__437322364481459969.zip -> hdfs://xxx:8020/user/hdfs/.sparkStaging/application_1522141683864_0091/__spark_libs__437322364481459969.zip
18/03/28 18:03:38 INFO Client: Uploading resource file:/opt/apache-kylin-2.3.0/lib/kylin-job-2.3.0.jar -> hdfs://xxx:8020/user/hdfs/.sparkStaging/application_1522141683864_0091/kylin-job-2.3.0.jar
18/03/28 18:03:39 INFO Client: Uploading resource file:/data/hdp/2.6.4.0-91/hbase/lib/htrace-core-3.1.0-incubating.jar -> hdfs://xxx:8020/user/hdfs/.sparkStaging/application_1522141683864_0091/htrace-core-3.1.0-incubating.jar
18/03/28 18:03:39 INFO Client: Uploading resource file:/data/hdp/2.6.4.0-91/hbase/lib/metrics-core-2.2.0.jar -> hdfs://xxx:8020/user/hdfs/.sparkStaging/application_1522141683864_0091/metrics-core-2.2.0.jar
18/03/28 18:03:39 INFO Client: Uploading resource file:/data/hdp/2.6.4.0-91/hbase/lib/guava-12.0.1.jar -> hdfs://xxx:8020/user/hdfs/.sparkStaging/application_1522141683864_0091/guava-12.0.1.jar
18/03/28 18:03:40 INFO Client: Uploading resource file:/tmp/spark-e9a0636e-6d6e-476a-ad62-dfcb1e4d6102/__spark_conf__5932514761984685807.zip -> hdfs://xxx:8020/user/hdfs/.sparkStaging/application_1522141683864_0091/__spark_conf__.zip
18/03/28 18:03:40 WARN Client: spark.yarn.am.extraJavaOptions will not take effect in cluster mode
18/03/28 18:03:40 INFO SecurityManager: Changing view acls to: hdfs
18/03/28 18:03:40 INFO SecurityManager: Changing modify acls to: hdfs
18/03/28 18:03:40 INFO SecurityManager: Changing view acls groups to:
18/03/28 18:03:40 INFO SecurityManager: Changing modify acls groups to:
18/03/28 18:03:40 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hdfs); groups with view permissions: Set(); users with modify permissions: Set(hdfs); groups with modify permissions: Set()
18/03/28 18:03:40 INFO Client: Submitting application application_1522141683864_0091 to ResourceManager
18/03/28 18:03:40 INFO YarnClientImpl: Submitted application application_1522141683864_0091
18/03/28 18:03:41 INFO Client: Application report for application_1522141683864_0091 (state: ACCEPTED)
18/03/28 18:03:41 INFO Client:
     client token: N/A
     diagnostics: AM container is launched, waiting for AM container to Register with RM
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1522231420490
     final status: UNDEFINED
     tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
     user: hdfs
18/03/28 18:03:42 INFO Client: Application report for application_1522141683864_0091 (state: ACCEPTED)
     ...
18/03/28 18:03:52 INFO Client: Application report for application_1522141683864_0091 (state: ACCEPTED)
18/03/28 18:03:53 INFO Client: Application report for application_1522141683864_0091 (state: RUNNING)
18/03/28 18:03:53 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: xxx
     ApplicationMaster RPC port: 0
     queue: default
     start time: 1522231420490
     final status: UNDEFINED
     tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
     user: hdfs
18/03/28 18:03:54 INFO Client: Application report for application_1522141683864_0091 (state: RUNNING)
     ...
18/03/28 18:04:15 INFO Client: Application report for application_1522141683864_0091 (state: RUNNING)
18/03/28 18:04:16 INFO Client: Application report for application_1522141683864_0091 (state: ACCEPTED)
18/03/28 18:04:16 INFO Client:
     client token: N/A
     diagnostics: AM container is launched, waiting for AM container to Register with RM
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1522231420490
     final status: UNDEFINED
     tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
     user: hdfs
18/03/28 18:04:17 INFO Client: Application report for application_1522141683864_0091 (state: ACCEPTED)
     ...
18/03/28 18:04:27 INFO Client: Application report for application_1522141683864_0091 (state: ACCEPTED)
18/03/28 18:04:28 INFO Client: Application report for application_1522141683864_0091 (state: RUNNING)
18/03/28 18:04:28 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: xxx
     ApplicationMaster RPC port: 0
     queue: default
     start time: 1522231420490
     final status: UNDEFINED
     tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
     user: hdfs
18/03/28 18:04:29 INFO Client: Application report for application_1522141683864_0091 (state: RUNNING)
     ...
18/03/28 18:04:47 INFO Client: Application report for application_1522141683864_0091 (state: RUNNING)
18/03/28 18:04:48 INFO Client: Application report for application_1522141683864_0091 (state: FINISHED)
18/03/28 18:04:48 INFO Client:
     client token: N/A
     diagnostics: User class threw exception: java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
     ApplicationMaster host: xxx
     ApplicationMaster RPC port: 0
     queue: default
     start time: 1522231420490
     final status: FAILED
     tracking URL: http://xxx:8088/proxy/application_1522141683864_0091/
     user: hdfs
Exception in thread "main" org.apache.spark.SparkException: Application application_1522141683864_0091 finished with failed status
     at org.apache.spark.deploy.yarn.Client.run(Client.scala:1180)
     at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1226)
     at org.apache.spark.deploy.yarn.Client.main(Client.scala)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:498)
     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:744)
     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/03/28 18:04:48 INFO ShutdownHookManager: Shutdown hook called
18/03/28 18:04:48 INFO ShutdownHookManager: Deleting directory /tmp/spark-e9a0636e-6d6e-476a-ad62-dfcb1e4d6102

The command is:
export HADOOP_CONF_DIR=/opt/apache-kylin-2.3.0/hadoop-conf && /opt/apache-kylin-2.3.0/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry --conf spark.executor.instances=1 --conf spark.yarn.queue=default --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history --conf spark.driver.extraJavaOptions=-Dhdp.version=current --conf spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec --conf spark.master=yarn --conf spark.executor.extraJavaOptions=-Dhdp.version=current --conf spark.hadoop.yarn.timeline-service.enabled=true --conf spark.executor.memory=1G --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs:///kylin/spark-history --conf spark.executor.cores=2 --conf spark.submit.deployMode=cluster --jars /data/hdp/2.6.4.0-91/hbase/lib/htrace-core-3.1.0-incubating.jar,/data/hdp/2.6.4.0-91/hbase/lib/metrics-core-2.2.0.jar,/data/hdp/2.6.4.0-91/hbase/lib/guava-12.0.1.jar, /opt/apache-kylin-2.3.0/lib/kylin-job-2.3.0.jar -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable kylin_tmp.kylin_intermediate_kylin_hive_metrics_job_qa_912467dc_7a1c_42ce_8ff5_60c2a5b11441 -output hdfs://xxx:8020/kylin/kylin_metadata/kylin-f943d829-3436-4816-bfa6-da2234401862/KYLIN_HIVE_METRICS_JOB_QA/cuboid/ -segmentId 912467dc-7a1c-42ce-8ff5-60c2a5b11441 -metaUrl kylin_metadata@hdfs,path=hdfs://xxx:8020/kylin/kylin_metadata/metadata/912467dc-7a1c-42ce-8ff5-60c2a5b11441 -cubename KYLIN_HIVE_METRICS_JOB_QA

I am not sure whether this is a problem with my HDP environment, but I can confirm that the sample cube builds fine with the Spark engine. Someone raised the same issue earlier, but it was never resolved, so please tell me what I should do.

PS: I cannot find any more detailed information about this exception; the output above is everything I could get from kylin.log. If you need anything else, tell me what steps to take, and thanks in advance.
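My next step is to pull the YARN container logs for the failed application, since the actual exception thrown by SparkCubingByLayer is probably only visible in the driver log rather than in kylin.log; if I have the YARN CLI right, the command should be roughly:

yarn logs -applicationId application_1522141683864_0091

I will post that output as well if it shows anything more specific.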