Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20657#discussion_r173383032
--- Diff:
core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -520,4 +520,16 @@ package object config {
.checkValue(v => v > 0, "The threshold should be positive.")
.createWithDefault(10000000)
+ private[spark] val CREDENTIALS_RENEWAL_INTERVAL_RATIO =
+ ConfigBuilder("spark.security.credentials.renewalRatio")
+ .doc("Ratio of the credential's expiration time when Spark should fetch new credentials.")
+ .doubleConf
+ .createWithDefault(0.75d)
+
+ private[spark] val CREDENTIALS_RENEWAL_RETRY_WAIT =
+ ConfigBuilder("spark.security.credentials.retryWait")
+ .doc("How long to wait before retrying to fetch new credentials after a failure.")
+ .timeConf(TimeUnit.SECONDS)
+ .createWithDefaultString("1h")
--- End diff --
Isn't this "1h" too big when the token expiration time is small, for example 8
hours or even less? In that case the next retry would fail directly.
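To make the interaction concrete, here is a minimal sketch of the scheduling logic these two configs imply, assuming renewal is attempted at `tokenLifetime * renewalRatio` and a failed attempt is retried after `retryWait` (the object and method names are hypothetical, not the actual Spark implementation):

```scala
object RenewalSchedule {
  // Hypothetical helper: the first renewal attempt is scheduled at
  // tokenLifetimeMs * renewalRatio; after a failure, the next attempt
  // waits retryWaitMs regardless of the remaining token lifetime.
  def nextAttemptMs(tokenLifetimeMs: Long,
                    renewalRatio: Double,
                    retryWaitMs: Long,
                    lastAttemptFailed: Boolean): Long = {
    if (lastAttemptFailed) retryWaitMs
    else (tokenLifetimeMs * renewalRatio).toLong
  }

  def main(args: Array[String]): Unit = {
    val hourMs = 60L * 60 * 1000
    // 8-hour token with the default 0.75 ratio: first attempt at the
    // 6-hour mark, leaving a 2-hour window before expiry.
    val renewAt = nextAttemptMs(8 * hourMs, 0.75, 1 * hourMs, lastAttemptFailed = false)
    val windowLeft = 8 * hourMs - renewAt
    // A 1h retry wait allows at most one more retry inside that window;
    // with a shorter-lived token the retry could land after expiry.
    println(s"window left after first attempt: ${windowLeft / hourMs}h")
  }
}
```

With shorter-lived tokens the remaining window shrinks below the fixed 1h wait, which is the failure mode raised above.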
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]