Hi

I searched the Internet and found these links; please give them a try, and I
hope they help.

https://community.hortonworks.com/questions/23699/bad-substitution-error-running-spark-on-yarn.html

https://stackoverflow.com/questions/32341709/bad-substitution-when-submitting-spark-job-to-yarn-cluster
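
In short, the "bad substitution" in your log comes from the ${hdp.version}
placeholder in the container launch classpath, which YARN never resolves on
HDP unless hdp.version is set. The usual workaround discussed in such threads
is to pass -Dhdp.version explicitly to the Spark driver, AM, and executors.
A minimal sketch of how that could look in Kylin's conf/kylin.properties
(the kylin.engine.spark-conf.* prefix assumes Kylin 2.x with the Spark cube
engine, and "2.6.0.3-8" is only a placeholder, to be replaced with your
cluster's actual HDP version, e.g. from `hdp-select status hadoop-client`):

  kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=2.6.0.3-8
  kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=2.6.0.3-8
  kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=2.6.0.3-8

Note that in yarn-cluster mode the driver option is the one that matters for
the AM (your log also warns that spark.yarn.am.extraJavaOptions is ignored in
cluster mode). If you run spark-submit directly, the equivalent is to put the
same -Dhdp.version option in spark-defaults.conf or in a java-opts file under
SPARK_HOME/conf; either way the goal is just that ${hdp.version} expands to a
real directory under /usr/hdp.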

-- 


Regards!

Aron Tao



Kang-Sen Lu <k...@anovadata.com> wrote on Thursday, November 29, 2018 at 3:11 PM:

> We are running Kylin 2.5.1. For one specific cube, the build for one hour
> of data took 200 minutes, so I am thinking about building the cube with
> Spark instead of MapReduce.
>
>
>
> I selected Spark in the cube design, under Advanced Setting.
>
>
>
> The cube build failed at step 3, with the following error log:
>
>
>
> OS command error exit with return code: 1, error message: 18/11/29
> 09:50:33 INFO client.RMProxy: Connecting to ResourceManager at
> anovadata6.anovadata.local/192.168.230.199:8050
>
> 18/11/29 09:50:33 INFO yarn.Client: Requesting a new application from
> cluster with 1 NodeManagers
>
> 18/11/29 09:50:33 INFO yarn.Client: Verifying our application has not
> requested more than the maximum memory capability of the cluster (191488 MB
> per container)
>
> 18/11/29 09:50:33 INFO yarn.Client: Will allocate AM container, with 2432
> MB memory including 384 MB overhead
>
> 18/11/29 09:50:33 INFO yarn.Client: Setting up container launch context
> for our AM
>
> 18/11/29 09:50:33 INFO yarn.Client: Setting up the launch environment for
> our AM container
>
> 18/11/29 09:50:33 INFO yarn.Client: Preparing resources for our AM
> container
>
> 18/11/29 09:50:35 WARN yarn.Client: Neither spark.yarn.jars nor
> spark.yarn.archive is set, falling back to uploading libraries under
> SPARK_HOME.
>
> 18/11/29 09:50:38 INFO yarn.Client: Uploading resource
> file:/tmp/spark-507691d4-f131-4bc5-bf6c-c8ff7606e201/__spark_libs__6261254232609828730.zip
> ->
> hdfs://anovadata6.anovadata.local:8020/user/zettics/.sparkStaging/application_1543422353836_0088/__spark_libs__6261254232609828730.zip
>
> 18/11/29 09:50:39 INFO yarn.Client: Uploading resource
> file:/home/zettics/kylin/apache-kylin-2.5.1-anovadata-bin/lib/kylin-job-2.5.1-anovadata.jar
> ->
> hdfs://anovadata6.anovadata.local:8020/user/zettics/.sparkStaging/application_1543422353836_0088/kylin-job-2.5.1-anovadata.jar
>
> 18/11/29 09:50:39 WARN yarn.Client: Same path resource
> file:/home/zettics/kylin/apache-kylin-2.5.1-anovadata-bin/lib/kylin-job-2.5.1-anovadata.jar
> added multiple times to distributed cache.
>
> 18/11/29 09:50:39 INFO yarn.Client: Uploading resource
> file:/tmp/spark-507691d4-f131-4bc5-bf6c-c8ff7606e201/__spark_conf__1525388499029792228.zip
> ->
> hdfs://anovadata6.anovadata.local:8020/user/zettics/.sparkStaging/application_1543422353836_0088/__spark_conf__.zip
>
> 18/11/29 09:50:39 WARN yarn.Client: spark.yarn.am.extraJavaOptions will
> not take effect in cluster mode
>
> 18/11/29 09:50:39 INFO spark.SecurityManager: Changing view acls to:
> zettics
>
> 18/11/29 09:50:39 INFO spark.SecurityManager: Changing modify acls to:
> zettics
>
> 18/11/29 09:50:39 INFO spark.SecurityManager: Changing view acls groups
> to:
>
> 18/11/29 09:50:39 INFO spark.SecurityManager: Changing modify acls groups
> to:
>
> 18/11/29 09:50:39 INFO spark.SecurityManager: SecurityManager:
> authentication disabled; ui acls disabled; users  with view permissions:
> Set(zettics); groups with view permissions: Set(); users  with modify
> permissions: Set(zettics); groups with modify permissions: Set()
>
> 18/11/29 09:50:39 INFO yarn.Client: Submitting application
> application_1543422353836_0088 to ResourceManager
>
> 18/11/29 09:50:39 INFO impl.YarnClientImpl: Submitted application
> application_1543422353836_0088
>
> 18/11/29 09:50:40 INFO yarn.Client: Application report for
> application_1543422353836_0088 (state: ACCEPTED)
>
> 18/11/29 09:50:40 INFO yarn.Client:
>
>          client token: N/A
>
>         diagnostics: AM container is launched, waiting for AM container to
> Register with RM
>
>         ApplicationMaster host: N/A
>
>         ApplicationMaster RPC port: -1
>
>         queue: default
>
>         start time: 1543503039903
>
>         final status: UNDEFINED
>
>         tracking URL:
> http://anovadata6.anovadata.local:8088/proxy/application_1543422353836_0088/
>
>         user: zettics
>
> 18/11/29 09:50:41 INFO yarn.Client: Application report for
> application_1543422353836_0088 (state: ACCEPTED)
>
> 18/11/29 09:50:42 INFO yarn.Client: Application report for
> application_1543422353836_0088 (state: ACCEPTED)
>
> 18/11/29 09:50:43 INFO yarn.Client: Application report for
> application_1543422353836_0088 (state: FAILED)
>
> 18/11/29 09:50:43 INFO yarn.Client:
>
>          client token: N/A
>
>         diagnostics: Application application_1543422353836_0088 failed 2
> times due to AM Container for appattempt_1543422353836_0088_000002 exited
> with  exitCode: 1
>
> For more detailed output, check the application tracking page:
> http://anovadata6.anovadata.local:8088/cluster/app/application_1543422353836_0088
> Then click on links to logs of each attempt.
>
> Diagnostics: Exception from container-launch.
>
> Container id: container_e05_1543422353836_0088_02_000001
>
> Exit code: 1
>
> Exception message:
> /hadoop/yarn/local/usercache/zettics/appcache/application_1543422353836_0088/container_e05_1543422353836_0088_02_000001/launch_container.sh:
> line 26:
> $PWD:$PWD/__spark_conf__:$PWD/__spark_libs__/*:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:
> bad substitution
>
>
>
> Stack trace: ExitCodeException exitCode=1:
> /hadoop/yarn/local/usercache/zettics/appcache/application_1543422353836_0088/container_e05_1543422353836_0088_02_000001/launch_container.sh:
> line 26:
> $PWD:$PWD/__spark_conf__:$PWD/__spark_libs__/*:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:
> bad substitution
>
>
>
>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:944)
>
>         at org.apache.hadoop.util.Shell.run(Shell.java:848)
>
>         at
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1142)
>
>         at
> org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:237)
>
>         at
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:317)
>
>         at
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:83)
>
>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>
>         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>
>         at java.lang.Thread.run(Thread.java:745)
>
>
>
>
>
> Thanks.
>
>
>
> Kang-sen
>
