HeartSaVioR commented on a change in pull request #31761:
URL: https://github.com/apache/spark/pull/31761#discussion_r592889372
##########
File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
##########
@@ -691,6 +691,15 @@ package object config {
       .toSequence
       .createWithDefault(Nil)

+  private[spark] val KERBEROS_FILESYSTEM_RENEWAL_EXCLUDE =
+    ConfigBuilder("spark.kerberos.renewal.exclude.hadoopFileSystems")
+      .doc("The list of Hadoop filesystem URLs whose hosts will be excluded from " +

Review comment:
       Ah OK, I think I missed part of @tgravescs's comment. My bad. His point wasn't that the config is unnecessary at all; his point was that the problem only occurs with defaultFs and/or stageFs, so the new config could be simplified rather than being general but somewhat verbose.
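       To make the distinction concrete, here is a minimal sketch (not part of this PR) of the "simplified" shape being suggested: a single boolean covering only the default/staging filesystem, instead of a user-supplied list of filesystem URLs. It is written as if it lived next to the entry in the package object shown in the diff above; the config name and version string below are hypothetical.

```scala
// Hypothetical, simplified alternative to the list-valued
// spark.kerberos.renewal.exclude.hadoopFileSystems config proposed in the PR:
// a boolean that only exempts the default/staging filesystem from token renewal.
private[spark] val KERBEROS_RENEWAL_EXCLUDE_DEFAULT_FS =
  ConfigBuilder("spark.kerberos.renewal.excludeDefaultHadoopFileSystem") // hypothetical name
    .doc("When true, do not attempt delegation token renewal for the cluster's " +
      "default/staging Hadoop filesystem.")
    .version("3.2.0") // illustrative only
    .booleanConf
    .createWithDefault(false)
```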