GitHub user LantaoJin opened a pull request:
https://github.com/apache/spark/pull/21396
[SPARK-24349][SQL] Ignore setting token if using JDBC
## What changes were proposed in this pull request?
In [SPARK-23639](https://issues.apache.org/jira/browse/SPARK-23639), using
--proxy-user to impersonate invokes obtainDelegationTokens(), but if the
driver uses JDBC instead of the metastore, it exits with:
```
WARN HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Hive metastore uri undefined
	at scala.Predef$.require(Predef.scala:224)
	at org.apache.spark.sql.hive.thriftserver.HiveCredentialProvider.obtainCredentials(HiveCredentialProvider.scala:73)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:56)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:288)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:137)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
	at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:169)
	at org.apache.spark.deploy.SparkSubmit$$anon$1.run(SparkSubmit.scala:167)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/05/22 05:24:16 INFO ShutdownHookManager: Shutdown hook called
18/05/22 05:24:16 INFO ShutdownHookManager: Deleting directory /tmp/spark-b63ad788-1a47-4326-9972-c4fde1dc19c3
```
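The shape of the fix can be sketched as a guard around token acquisition: only attempt to obtain a Hive delegation token when a metastore URI is actually configured. This is a minimal illustration, not the actual patch; the object name `TokenGuard` and the plain `Map` standing in for the real Hive configuration lookup are assumptions for the sketch.

```scala
// Hypothetical sketch of the guard this PR proposes: skip delegation-token
// acquisition when hive.metastore.uris is absent (i.e. the driver talks to
// the database over JDBC and there is no metastore to fetch a token from).
object TokenGuard {
  // `hiveConf` is a plain Map standing in for the real HiveConf lookup.
  def metastoreUriDefined(hiveConf: Map[String, String]): Boolean =
    hiveConf.get("hive.metastore.uris").exists(_.trim.nonEmpty)

  def obtainTokensIfNeeded(hiveConf: Map[String, String]): Boolean = {
    if (metastoreUriDefined(hiveConf)) {
      // ... would call obtainDelegationTokens() against the metastore here ...
      true
    } else {
      // JDBC-direct setup: no metastore URI, so no token to obtain.
      false
    }
  }
}
```

With such a guard in place, the `require(..., "Hive metastore uri undefined")` check is never reached in the JDBC-direct configuration, so `--proxy-user` no longer crashes the driver.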
## How was this patch tested?
Remove or comment out the configuration **hive.metastore.uris** in
hive-site.xml (using JDBC to connect to the database directly).
The command below will fail without this patch:
> bin/spark-sql --proxy-user x_user --master local
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/LantaoJin/spark SPARK-24349
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/21396.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #21396
----
commit eabf1b3ec82f4f6c5484b11557b72092f161256d
Author: LantaoJin <jinlantao@...>
Date: 2018-05-22T14:48:24Z
[SPARK-24349][SQL] Ignore setting token if using JDBC
----