Github user arunmahadevan commented on a diff in the pull request:
https://github.com/apache/spark/pull/22598#discussion_r228671263
--- Diff:
core/src/main/scala/org/apache/spark/deploy/security/HadoopDelegationTokenManager.scala
---
@@ -66,7 +66,8 @@ private[spark] class HadoopDelegationTokenManager(
   private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
     val providers = Seq(new HadoopFSDelegationTokenProvider(fileSystems)) ++
       safeCreateProvider(new HiveDelegationTokenProvider) ++
-      safeCreateProvider(new HBaseDelegationTokenProvider)
+      safeCreateProvider(new HBaseDelegationTokenProvider) ++
+      safeCreateProvider(new KafkaDelegationTokenProvider)
--- End diff ---
Why I thought disabling by default might make sense:
The token fetch would be attempted if just "spark.kafka.bootstrap.servers"
is defined, and if that config is set, the spark-sql-kafka libraries need to be
on the classpath as well. It would be better to mention these requirements in the docs.
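
For illustration, a minimal sketch of the gating behavior described above, assuming
the provider decides whether tokens are needed by checking the bootstrap servers
config (the object and method names here are hypothetical, not the PR's actual code):

import org.apache.spark.SparkConf

// Hypothetical sketch: a token fetch would only be attempted when the
// bootstrap servers config is present. If it is set, the spark-sql-kafka
// libraries also have to be on the classpath for the fetch to succeed.
object KafkaTokenGatingSketch {
  val bootstrapServersKey = "spark.kafka.bootstrap.servers"

  def tokensRequired(conf: SparkConf): Boolean =
    conf.contains(bootstrapServersKey)
}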
We could also consider prefixing all the configs with
spark.security.credentials.kafka instead of spark.kafka (e.g.
spark.security.credentials.kafka.bootstrap.servers) to make it explicit that
these are security-related settings required for fetching Kafka delegation
tokens.
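
To illustrate the suggested naming (hypothetical key, not an agreed-upon config),
the same setting would be provided under the security-credentials prefix:

import org.apache.spark.SparkConf

// Hypothetical example of the proposed prefix: the bootstrap servers list is
// set under spark.security.credentials.kafka.* rather than spark.kafka.*.
val conf = new SparkConf()
  .set("spark.security.credentials.kafka.bootstrap.servers",
    "broker1:9092,broker2:9092")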
---