It's because you submitted the job from Windows to a Hadoop cluster running
on Linux. Spark does not support this yet. See
https://issues.apache.org/jira/browse/SPARK-1825
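
Roughly what goes wrong (a minimal sketch of my own below, not Spark's actual
code): the client builds the container's CLASSPATH using its own platform's
conventions, so a Windows client emits %PWD% references and ';' separators,
which the bash launch script on the Linux NodeManager can neither expand nor
split. The object and parameter names here are assumptions for illustration
only:

// Illustrative Scala sketch only -- the platform flag and entry names are
// assumptions for demonstration, not Spark internals.
object ClasspathSketch {
  // The client composes the container's CLASSPATH using *its own* OS
  // conventions for env-var references and path separators.
  def containerClasspath(clientIsWindows: Boolean): String = {
    val pwd = if (clientIsWindows) "%PWD%" else "$PWD"   // env-var syntax
    val sep = if (clientIsWindows) ";" else ":"          // classpath separator
    Seq(pwd, pwd + "/__spark__.jar", pwd + "/__app__.jar").mkString(sep)
  }

  def main(args: Array[String]): Unit = {
    // Built on a Windows client:
    println(containerClasspath(clientIsWindows = true))
    //   %PWD%;%PWD%/__spark__.jar;%PWD%/__app__.jar
    //   (bash on the Linux node leaves %PWD% literal and expects ':')
    // What the Linux container actually needs:
    println(containerClasspath(clientIsWindows = false))
    //   $PWD:$PWD/__spark__.jar:$PWD/__app__.jar
  }
}

That mismatch is why line 27 of the generated launch_container.sh in your log
fails: it mixes Windows-style %PWD% entries and ';' separators with the
Linux-style $PWD paths that come from the cluster's own configuration.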

Best Regards,
Shixiong Zhu

2015-01-28 17:35 GMT+08:00 Marco <marco....@gmail.com>:

> I've created a spark app, which runs fine if I copy the corresponding
> jar to the hadoop-server (where yarn is running) and submit it there.
>
> If I try to submit it from my local machine, I get the error which
> I've attached below.
> Submit cmd: "spark-submit.cmd --class
> ExamplesHadoop.SparkHbase.TruckEvents  --master yarn-cluster
> .\SparkHbase-1.0-SNAPSHOT-jar-with-dependencies.jar"
>
> Even after raising the retention time for the YARN logs, there is
> still no log when I try to fetch it via yarn logs -applicationId
> myApplicationId.
>
> Any hints on how I could find the root cause of this issue?
>
> Thanks,
> Marco
>
>
> <<<<<
> 15/01/28 10:25:06 INFO spark.SecurityManager: Changing modify acls to: user
> 15/01/28 10:25:06 INFO spark.SecurityManager: SecurityManager:
> authentication disabled; ui acls disabled; users with view
> permissions: Set(user); users with modify permissions:
> Set((user))
> 15/01/28 10:25:06 INFO yarn.Client: Submitting application 9 to
> ResourceManager
> 15/01/28 10:25:06 INFO impl.YarnClientImpl: Submitted application
> application_1422368366192_0009
> 15/01/28 10:25:07 INFO yarn.Client: Application report for
> application_1422368366192_0009 (state: ACCEPTED)
> 15/01/28 10:25:07 INFO yarn.Client:
>      client token: N/A
>      diagnostics: N/A
>      ApplicationMaster host: N/A
>      ApplicationMaster RPC port: -1
>      queue: default
>      start time: 1422437106550
>      final status: UNDEFINED
>      tracking URL:
> http://server:8088/proxy/application_1422368366192_0009/
>      user: root
> 15/01/28 10:25:08 INFO yarn.Client: Application report for
> application_1422368366192_0009 (state: ACCEPTED)
> 15/01/28 10:25:09 INFO yarn.Client: Application report for
> application_1422368366192_0009 (state: ACCEPTED)
> 15/01/28 10:25:10 INFO yarn.Client: Application report for
> application_1422368366192_0009 (state: ACCEPTED)
> 15/01/28 10:25:11 INFO yarn.Client: Application report for
> application_1422368366192_0009 (state: FAILED)
> 15/01/28 10:25:11 INFO yarn.Client:
>      client token: N/A
>      diagnostics: Application application_1422368366192_0009 failed 2
> times due to AM Container for appattempt_1422368366192_0009_000002
> exited with  exitCode: 1
> For more detailed output, check application tracking
> page:http://server:8088/proxy/application_1422368366192_0009/Then,
> click on links to logs of each attempt.
> Diagnostics: Exception from container-launch.
> Container id: container_1422368366192_0009_02_000001
> Exit code: 1
> Exception message:
>
> /hadoop/yarn/local/usercache/root/appcache/application_1422368366192_0009/container_1422368366192_0009_02_000001/launch_container.sh: line 27: %PWD%;%PWD%/__spark__.jar;$HADOOP_CONF_DIR;/usr/hdp/current/hadoop-client/*;/usr/hdp/current/hadoop-client/lib/*;/usr/hdp/current/hadoop-hdfs-client/*;/usr/hdp/current/hadoop-hdfs-client/lib/*;/usr/hdp/current/hadoop-yarn-client/*;/usr/hdp/current/hadoop-yarn-client/lib/*;$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure;%PWD%/__app__.jar;%PWD%/*: bad substitution
> /bin/bash: line 0: fg: no job control
> Stack trace: ExitCodeException exitCode=1:
>
> /hadoop/yarn/local/usercache/root/appcache/application_1422368366192_0009/container_1422368366192_0009_02_000001/launch_container.sh: line 27: %PWD%;%PWD%/__spark__.jar;$HADOOP_CONF_DIR;/usr/hdp/current/hadoop-client/*;/usr/hdp/current/hadoop-client/lib/*;/usr/hdp/current/hadoop-hdfs-client/*;/usr/hdp/current/hadoop-hdfs-client/lib/*;/usr/hdp/current/hadoop-yarn-client/*;/usr/hdp/current/hadoop-yarn-client/lib/*;$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure;%PWD%/__app__.jar;%PWD%/*: bad substitution
> /bin/bash: line 0: fg: no job control
>     at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
>     at org.apache.hadoop.util.Shell.run(Shell.java:455)
>     at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
>     at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
>     at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
>     at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:745)
> Container exited with a non-zero exit code 1
> Failing this attempt. Failing the application.
>      ApplicationMaster host: N/A
>      ApplicationMaster RPC port: -1
>      queue: default
>      start time: 1422437106550
>      final status: FAILED
>      tracking URL:
> http://server:8088/cluster/app/application_1422368366192_0009
>      user: root
> Exception in thread "main" org.apache.spark.SparkException: Application finished with failed status
>     at org.apache.spark.deploy.yarn.ClientBase$class.run(ClientBase.scala:504)
>     at org.apache.spark.deploy.yarn.Client.run(Client.scala:35)
>     at org.apache.spark.deploy.yarn.Client$.main(Client.scala:139)
>     at org.apache.spark.deploy.yarn.Client.main(Client.scala)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:483)
>     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >>>>
>
