Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/17335
@subrotosanyal
I was able to write some code that should work for your use case even
without the fix for SPARK-15754. I reverted that change and ran the following
code a few times in the same JVM:
```java
PrivilegedExceptionAction<Void> action = () -> {
  dumpTokens("before");
  runSpark();
  dumpTokens("after");
  return null;
};

UserGroupInformation ugi =
    UserGroupInformation.loginUserFromKeytabAndReturnUGI(principal, keytab);
ugi.doAs(action);
```
(Where `dumpTokens` prints the tokens in the UGI, and `runSpark` starts a
SparkContext and stops it.)
Each iteration starts with no tokens and finishes with an HDFS delegation
token, so it seems to have the behavior you want.
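For reference, a minimal sketch of what a `dumpTokens` helper like the one above could look like (the actual helper isn't shown here, so this is an assumption), using Hadoop's `UserGroupInformation` API:

```java
import java.io.IOException;

import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.token.Token;

public class TokenDump {
  // Hypothetical helper: prints the kind and service of every token
  // attached to the current UGI, labeled so the before/after runs
  // can be told apart in the output.
  static void dumpTokens(String label) throws IOException {
    UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
    System.out.println("Tokens (" + label + "): " + ugi.getTokens().size());
    for (Token<?> token : ugi.getTokens()) {
      System.out.println("  " + token.getKind() + " @ " + token.getService());
    }
  }
}
```

Run inside the `doAs` block, this would show an empty token list before `runSpark()` and an HDFS delegation token after it.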
That said, if reverting the fix for SPARK-15754 fixes the Hive token issue, we
should probably do that, since there seems to be a way to make things work in
the embedded case anyway.