Github user gaborgsomogyi commented on a diff in the pull request:
https://github.com/apache/spark/pull/22598#discussion_r230683713
--- Diff:
core/src/main/scala/org/apache/spark/deploy/security/HadoopDelegationTokenManager.scala
---
@@ -66,7 +66,8 @@ private[spark] class HadoopDelegationTokenManager(
   private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
     val providers = Seq(new HadoopFSDelegationTokenProvider(fileSystems)) ++
       safeCreateProvider(new HiveDelegationTokenProvider) ++
-      safeCreateProvider(new HBaseDelegationTokenProvider)
+      safeCreateProvider(new HBaseDelegationTokenProvider) ++
+      safeCreateProvider(new KafkaDelegationTokenProvider)
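
For context, the `safeCreateProvider` calls chained above wrap provider construction so that an optional provider whose classes are missing from the classpath is skipped instead of failing token setup. A minimal, self-contained sketch of that pattern — the names and error handling here are illustrative, not Spark's exact implementation:

```scala
// Illustrative sketch of the safeCreateProvider pattern; Spark's actual
// method logs the failure and has a HadoopDelegationTokenProvider-specific
// signature.
object SafeCreate {
  // Evaluate the by-name constructor; if it throws (e.g. because the
  // provider's classes are not on the classpath), return None so the
  // provider is simply omitted from the resulting sequence.
  def safeCreate[T](createFn: => T): Option[T] =
    try {
      Some(createFn)
    } catch {
      case _: Throwable => None
    }

  def main(args: Array[String]): Unit = {
    println(safeCreate(42))                                       // Some(42)
    println(safeCreate[Int](throw new NoClassDefFoundError("x"))) // None
  }
}
```

Because the result is an `Option`, the `++` chaining in the diff naturally drops providers that failed to load.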
--- End diff --
We've considered turning it off by default and came to the conclusion
Marcelo described, which I think still stands.
The PR doesn't cover documentation because the design may still change. My
plan is to add it in a separate PR once the feature is merged.
Regarding what to document, I think the Kafka integration guide should
cover everything; it already notes that the jar has to be on the classpath.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]