Trystan Leftwich created SPARK-15048:
----------------------------------------

             Summary: When running the Thrift server with YARN on a secure 
cluster, the wrong keytab location is passed.
                 Key: SPARK-15048
                 URL: https://issues.apache.org/jira/browse/SPARK-15048
             Project: Spark
          Issue Type: Bug
    Affects Versions: 2.0.0
            Reporter: Trystan Leftwich


When running the hive-thriftserver with YARN on a secure cluster, the wrong 
keytab location is passed to the Hive client and startup fails:

{code}
16/05/01 19:33:52 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
Exception in thread "main" org.apache.spark.SparkException: Keytab file: test.keytab-e3754e07-c798-4e6a-8745-c5f9d3483507 specified in spark.yarn.keytab does not exist
        at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:111)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:364)
        at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:268)
        at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
        at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
        at org.apache.spark.sql.hive.HiveSessionState.metadataHive$lzycompute(HiveSessionState.scala:45)
        at org.apache.spark.sql.hive.HiveSessionState.metadataHive(HiveSessionState.scala:45)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:60)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:81)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:726)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/05/01 19:33:52 INFO spark.SparkContext: Invoking stop() from shutdown hook
{code}
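
The exception comes from the Kerberos login block in HiveClientImpl, which resolves the spark.yarn.keytab value as a local path on the driver. Roughly what that check does (a paraphrased sketch of branch-2.0 around HiveClientImpl.scala:111, not the verbatim source):

{code}
// Paraphrased sketch: HiveClientImpl reads the principal/keytab from the
// conf it is handed and requires the keytab to exist as a local file
// before logging in.
// (imports assumed: java.io.File, org.apache.spark.SparkException,
//  org.apache.hadoop.security.UserGroupInformation)
if (sparkConf.contains("spark.yarn.principal") && sparkConf.contains("spark.yarn.keytab")) {
  val principal = sparkConf.get("spark.yarn.principal")
  val keytab = sparkConf.get("spark.yarn.keytab")
  if (!new File(keytab).exists()) {
    throw new SparkException(s"Keytab file: $keytab specified in spark.yarn.keytab does not exist")
  } else {
    UserGroupInformation.loginUserFromKeytab(principal, keytab)
  }
}
{code}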

Note: You will need the patch from SPARK-15046 before you can encounter this bug.

It looks like this specific commit introduced the issue:
https://github.com/apache/spark/commit/8301fadd8d269da11e72870b7a889596e3337839#diff-6fd847124f8eae45ba2de1cf7d6296feL93
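
The random suffix on the keytab name in the error above comes from the YARN client: creating the SparkContext launches org.apache.spark.deploy.yarn.Client, which rewrites spark.yarn.keytab in the context's conf to a UUID-suffixed name intended for the distributed cache. Roughly (a paraphrased sketch, not the verbatim source):

{code}
// Paraphrased from Client#setupCredentials: the conf ends up holding the
// name the keytab will have on the AM, not the local driver-side path,
// so a later driver-side lookup of that name fails.
// (imports assumed: java.io.File, java.util.UUID)
val f = new File(keytab)
val keytabFileName = f.getName + "-" + UUID.randomUUID().toString
sparkConf.set("spark.yarn.keytab", keytabFileName)
{code}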

Re-adding that one line fixes the bug.
Alternatively, it's possible to "reset" the config before Hive needs it,

i.e. by adding code similar to the diff below at the following location:
https://github.com/apache/spark/blob/master/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala#L57

{code}
diff --git a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala
index 665a44e..0e32b87 100644
--- a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala
+++ b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala
@@ -55,6 +55,15 @@ private[hive] object SparkSQLEnv extends Logging {
           maybeKryoReferenceTracking.getOrElse("false"))
 
       sparkContext = new SparkContext(sparkConf)
+      // Creating the SparkContext launches the YARN Client, which rewrites
+      // spark.yarn.principal/spark.yarn.keytab in the context's conf;
+      // restore the original (local) values so the Hive client can log in.
+      if (sparkConf.contains("spark.yarn.principal")) {
+        sparkContext.conf.set("spark.yarn.principal", sparkConf.get("spark.yarn.principal"))
+      }
+      if (sparkConf.contains("spark.yarn.keytab")) {
+        sparkContext.conf.set("spark.yarn.keytab", sparkConf.get("spark.yarn.keytab"))
+      }
       sqlContext = SparkSession.withHiveSupport(sparkContext).wrapped
       val sessionState = sqlContext.sessionState.asInstanceOf[HiveSessionState]
       sessionState.metadataHive.setOut(new PrintStream(System.out, true, "UTF-8"))
{code}
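
With the reset in place, sparkContext.conf should again carry the original local keytab path, so the existence check in HiveClientImpl passes and loginUserFromKeytab can run.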




