Hi,

I am running into a strange situation where the command below works when the
deploy mode is client but fails when it is cluster.

spark-submit --master yarn --deploy-mode client \
  --files /etc/hive/conf/hive-site.xml,/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml \
  --driver-memory 70g --num-executors 6 --executor-cores 3 --driver-cores 3 \
  --driver-memory 7g --py-files /appl/common/ftp/ftp_event_data.py \
  /appl/common/ftp/ftp_event_data.py /appl/common/ftp/conf.json 2021-05-10 7
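
In case it matters, here is a stripped-down sketch of how the driver script
consumes its arguments (illustrative names only, not the exact code in
ftp_event_data.py). I include it because my understanding is that in cluster
mode the driver runs inside the YARN ApplicationMaster on a NodeManager
rather than on the edge node, so reading /appl/common/ftp/conf.json straight
from the local filesystem only works if that path exists on that node too:

import json
import sys

from pyspark.sql import SparkSession

# No hard-coded master or deploy mode; spark-submit supplies both.
spark = (SparkSession.builder
         .appName("ftp_event_data")
         .enableHiveSupport()
         .getOrCreate())

# Arguments as passed on the spark-submit line:
#   /appl/common/ftp/conf.json 2021-05-10 7
conf_path, run_date, lookback_days = sys.argv[1], sys.argv[2], int(sys.argv[3])

# In client mode this open() runs on the edge node where the file exists;
# in cluster mode it runs on whichever NodeManager hosts the driver.
with open(conf_path) as fh:
    conf = json.load(fh)

print(conf.keys(), run_date, lookback_days)
spark.stop()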



21/05/14 17:34:39 INFO ApplicationMaster: Waiting for spark context initialization...
21/05/14 17:34:39 WARN SparkConf: The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
21/05/14 17:34:39 ERROR ApplicationMaster: User application exited with status 1
21/05/14 17:34:39 INFO ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: User application exited with status 1)
21/05/14 17:34:39 ERROR ApplicationMaster: Uncaught exception:
org.apache.spark.SparkException: Exception thrown in awaitResult:
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:226)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:447)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:799)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:798)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:798)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: org.apache.spark.SparkUserAppException: User application exited with 1
        at org.apache.spark.deploy.PythonRunner$.main(PythonRunner.scala:106)
        at org.apache.spark.deploy.PythonRunner.main(PythonRunner.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:667)
21/05/14 17:34:39 INFO ApplicationMaster: Deleting staging directory hdfs://dev-cbb-datalake/user/nifiuser/.sparkStaging/application_1620318563358_0046
21/05/14 17:34:41 INFO ShutdownHookManager: Shutdown hook called


For more detailed output, check the application tracking page: https://srvbigddvlsh115.us.dev.corp:8090/cluster/app/application_1620318563358_0046
Then click on links to logs of each attempt.
. Failing the application.
Exception in thread "main" org.apache.spark.SparkException: Application application_1620318563358_0046 finished with failed status
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:1155)
        at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1603)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
21/05/14 17:34:42 INFO util.ShutdownHookManager: Shutdown hook called
21/05/14 17:34:42 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-28fa7d64-5a1d-42fb-865f-e9bb24854e7c
21/05/14 17:34:42 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-db93f731-d48a-4a7b-986f-e0a016bbd7f3
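
On a side note, I also see the SparkConf WARN about 'spark.yarn.executor.memoryOverhead'.
I assume the deprecated key can simply be swapped for the new one wherever it
is set for us (spark-defaults.conf, a --conf flag, or in code), roughly like
this if done in code:

from pyspark.sql import SparkSession

# Illustrative value only; the same effect as passing
# --conf spark.executor.memoryOverhead=4g on spark-submit.
spark = (SparkSession.builder
         .config("spark.executor.memoryOverhead", "4g")
         .getOrCreate())

I don't think that warning is related to the failure itself, but I mention it
for completeness.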

Thanks,
Asmath
