Github user tgravescs commented on a diff in the pull request:
https://github.com/apache/spark/pull/5823#discussion_r29486248
--- Diff: core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala ---
@@ -168,6 +170,26 @@ private[spark] object CoarseGrainedExecutorBackend extends Logging {
driverConf.set(key, value)
}
}
+    // Delegation Token Updater is not supported in Hadoop 1, so use reflection.
+    // Can't use Option[ExecutorDelegationTokenUpdater] because it is built only in YARN
+    // profile, so use Option[Any] since even the stop method call will be via reflection.
+    var tokenUpdaterOption: Option[Any] = None
+    var tokenUpdaterClass: Option[Class[_]] = None
--- End diff --
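The reflective approach shown in the diff comment could be sketched roughly as follows. This is a minimal illustration, not the actual Spark code: the helper object, the constructor signature, and the way the conf is passed are all hypothetical; only the class name comes from the diff comment.

```scala
// Hypothetical sketch of loading a YARN-only class via reflection so the
// core module has no compile-time dependency on the YARN profile.
object ReflectiveTokenUpdater {

  // Try to instantiate the updater reflectively. Returns None when the
  // class is not on the classpath (i.e., not built with the YARN profile).
  def start(conf: AnyRef): Option[Any] = {
    try {
      val clazz = Class.forName(
        "org.apache.spark.deploy.yarn.ExecutorDelegationTokenUpdater")
      // Constructor signature is illustrative only.
      Some(clazz.getConstructor(conf.getClass).newInstance(conf))
    } catch {
      case _: ClassNotFoundException => None // non-YARN build: no-op
    }
  }

  // Even stop() must go through reflection, since we only hold an Option[Any].
  def stop(updaterOption: Option[Any]): Unit = {
    updaterOption.foreach { updater =>
      updater.getClass.getMethod("stop").invoke(updater)
    }
  }
}
```

On a non-YARN classpath `start` simply yields `None`, so the executor path degrades gracefully without a hard dependency.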
I'm wondering if we can create a new function in SparkHadoopUtil and
YarnSparkHadoopUtil. The SparkHadoopUtil version wouldn't do anything, and the
YarnSparkHadoopUtil version would have this logic. That way it's only used on
YARN. I still need to check exactly which versions of Hadoop support
addCredentials and getCredentials on UserGroupInformation.
I haven't looked into exactly how that would work. @harishreedharan, thoughts?
The downside is we would either have to keep the reference to the
tokenUpdater in YarnSparkHadoopUtil so we can call stop on it, or return it
back here and still use reflection. I was trying to keep too much stuff from
going in there, but it seems like it might be the most convenient place at this
point.
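The no-op-base / YARN-override pattern suggested above could look something like this. All method names here are hypothetical placeholders, stripped down to show only the shape of the idea:

```scala
// Sketch of the suggestion: the base SparkHadoopUtil exposes no-op hooks,
// and the YARN-profile subclass overrides them with the real token-updater
// logic, so core code never needs reflection or a YARN dependency.
class SparkHadoopUtilSketch {
  // No-ops on non-YARN deployments.
  def startExecutorDelegationTokenUpdater(): Unit = {}
  def stopExecutorDelegationTokenUpdater(): Unit = {}
}

class YarnSparkHadoopUtilSketch extends SparkHadoopUtilSketch {
  // The YARN version keeps the updater reference internally, addressing
  // the "who holds the reference for stop()" question raised above.
  private var started = false

  override def startExecutorDelegationTokenUpdater(): Unit = { started = true }
  override def stopExecutorDelegationTokenUpdater(): Unit = { started = false }

  def isStarted: Boolean = started
}
```

Callers in core would just invoke the hooks on whichever SparkHadoopUtil instance is active, and the behavior differs only under the YARN profile.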