viirya commented on a change in pull request #31761:
URL: https://github.com/apache/spark/pull/31761#discussion_r591059181
##########
File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
##########
@@ -691,6 +691,15 @@ package object config {
.toSequence
.createWithDefault(Nil)
+ private[spark] val KERBEROS_FILESYSTEM_RENEWAL_EXCLUDE =
+ ConfigBuilder("spark.kerberos.renewal.exclude.hadoopFileSystems")
+ .doc("The list of Hadoop filesystem URLs whose hosts will be excluded from " +
Review comment:
If I understand correctly, obtaining a delegation token and renewing it go
through the same code path. What we call "renewal" is really just Spark
periodically obtaining a fresh delegation token.
So I don't think there is a case where we would obtain a delegation token but
not renew it; at least in Spark I don't see how such a case is possible. I may
be wrong here.
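To illustrate the point, here is a minimal sketch (not Spark's actual
internals) of what "renewal as periodic re-obtain" looks like: the scheduled
task simply invokes the same obtain logic again. `obtainDelegationTokens` and
the interval are hypothetical stand-ins.

```scala
// Illustrative sketch only -- not Spark's real token manager.
import java.util.concurrent.{Executors, TimeUnit}

object TokenRenewalSketch {
  // Hypothetical stand-in for the single code path that fetches tokens.
  def obtainDelegationTokens(): Unit =
    println("obtaining delegation tokens for configured filesystems")

  def main(args: Array[String]): Unit = {
    val scheduler = Executors.newSingleThreadScheduledExecutor()
    // "Renewal" is just re-running the same obtain step on a schedule,
    // so excluding a filesystem from renewal would also exclude it
    // from the initial obtain.
    scheduler.scheduleAtFixedRate(
      () => obtainDelegationTokens(),
      0L, 60L, TimeUnit.SECONDS)
  }
}
```

This is why a separate "exclude from renewal only" semantic is hard to express: there is no distinct renewal path to hook into.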
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]