[ https://issues.apache.org/jira/browse/SPARK-28634?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin reopened SPARK-28634:
------------------------------------

> Failed to start SparkSession with Keytab file 
> ----------------------------------------------
>
>                 Key: SPARK-28634
>                 URL: https://issues.apache.org/jira/browse/SPARK-28634
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 3.0.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> {noformat}
> [user-etl@hermesdevour002-700165 spark-3.0.0-SNAPSHOT-bin-2.7.4]$ bin/spark-sql --master yarn --conf spark.yarn.keytab=/apache/spark-2.3.0-bin-2.7.3/conf/user-etl.keytab --conf spark.yarn.principal=user-...@prod.example.com
> log4j:WARN No such property [maxFileSize] in org.apache.log4j.rolling.RollingFileAppender.
> log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.rolling.RollingFileAppender.
> Exception in thread "main" org.apache.spark.SparkException: Application application_1564558112805_1794 failed 2 times due to AM Container for appattempt_1564558112805_1794_000002 exited with  exitCode: 1
> For more detailed output, check the application tracking page: https://0.0.0.0:8190/applicationhistory/app/application_1564558112805_1794 Then click on links to logs of each attempt.
> Diagnostics: Exception from container-launch.
> Container id: container_e1987_1564558112805_1794_02_000001
> Exit code: 1
> Shell output: main : command provided 1
> main : run as user is user-etl
> main : requested yarn user is user-etl
> Getting exit code file...
> Creating script paths...
> Writing pid file...
> Writing to tmp file /hadoop/2/yarn/local/nmPrivate/application_1564558112805_1794/container_e1987_1564558112805_1794_02_000001/container_e1987_1564558112805_1794_02_000001.pid.tmp
> Writing to cgroup task files...
> Creating local dirs...
> Launching container...
> Getting exit code file...
> Creating script paths...
> Container exited with a non-zero exit code 1. Error file: prelaunch.err.
> Last 4096 bytes of prelaunch.err :
> Last 4096 bytes of stderr :
> log4j:WARN No such property [maxFileSize] in org.apache.log4j.rolling.RollingFileAppender.
> log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.rolling.RollingFileAppender.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/hadoop/2/yarn/local/usercache/user-etl/filecache/58/__spark_libs__4358879230136591830.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/apache/releases/hbase-1.1.2.2.6.4.1/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/apache/releases/hadoop-2.7.3.2.6.4.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> Exception in thread "main" org.apache.spark.SparkException: Keytab file: /apache/spark-2.3.0-bin-2.7.3/conf/user-etl.keytab does not exist
>       at org.apache.spark.deploy.SparkHadoopUtil.loginUserFromKeytab(SparkHadoopUtil.scala:131)
>       at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:846)
>       at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:889)
>       at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
> Failing this attempt. Failing the application.
>       at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:95)
>       at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
>       at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:185)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
>       at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2466)
>       at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:948)
>       at scala.Option.getOrElse(Option.scala:138)
>       at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:939)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:48)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:315)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:166)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>       at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:853)
>       at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:168)
>       at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:196)
>       at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:87)
>       at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:932)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:941)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {noformat}
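> Reading the trace, the failure comes from the keytab existence check that the ApplicationMaster runs via SparkHadoopUtil.loginUserFromKeytab (SparkHadoopUtil.scala:131). A minimal sketch of that check, reconstructed only from the exception message above (not copied from the Spark source, so details may differ):
> {noformat}
> import java.io.File
> import org.apache.hadoop.security.UserGroupInformation
> import org.apache.spark.SparkException
>
> // Hypothetical reconstruction of the check that throws in the AM container.
> object KeytabLoginSketch {
>   def loginUserFromKeytab(principalName: String, keytabFilename: String): Unit = {
>     // The path is checked on the local filesystem of the JVM doing the login.
>     // Since the stack trace shows this running inside the YARN AM container,
>     // a keytab path that only exists on the submitting host would fail here.
>     if (!new File(keytabFilename).exists()) {
>       throw new SparkException(s"Keytab file: $keytabFilename does not exist")
>     }
>     UserGroupInformation.loginUserFromKeytab(principalName, keytabFilename)
>   }
> }
> {noformat}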
> The keytab file exists and works with Spark 2.3 and 2.4.
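> A possible cross-check, assuming the spark.kerberos.keytab / spark.kerberos.principal names that Spark 3.0 documents in place of the deprecated spark.yarn.* ones:
> {noformat}
> bin/spark-sql --master yarn \
>   --conf spark.kerberos.keytab=/apache/spark-2.3.0-bin-2.7.3/conf/user-etl.keytab \
>   --conf spark.kerberos.principal=user-...@prod.example.com
> {noformat}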


