Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14065#discussion_r71357664
  
    --- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/security/AMDelegationTokenRenewer.scala ---
    @@ -171,10 +174,10 @@ private[yarn] class AMDelegationTokenRenewer(
         keytabLoggedInUGI.doAs(new PrivilegedExceptionAction[Void] {
           // Get a copy of the credentials
           override def run(): Void = {
    -        val nns = YarnSparkHadoopUtil.get.getNameNodesToAccess(sparkConf) + dst
    -        hadoopUtil.obtainTokensForNamenodes(nns, freshHadoopConf, tempCreds)
    -        hadoopUtil.obtainTokenForHiveMetastore(sparkConf, freshHadoopConf, tempCreds)
    -        hadoopUtil.obtainTokenForHBase(sparkConf, freshHadoopConf, tempCreds)
    +        val nearestNextTime = credentialManager.obtainCredentials(freshHadoopConf, tempCreds)
    +        require(nearestNextTime > System.currentTimeMillis(),
    +          s"Time of next renewal $nearestNextTime is earlier than now")
    +        timeOfNextRenewal = nearestNextTime
    --- End diff --
    
    driverTokenRenewerRunnable is hardcoded to renew every hour: 
                  delegationTokenRenewer.schedule(this, 1, TimeUnit.HOURS)
    
    We should probably make this configurable.  Ideally nothing needs to be
renewed that often, but if we want to support arbitrary services, making it
configurable would make sense.
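
    As a rough sketch of what a configurable interval might look like: in the
    real code the value could come from SparkConf (e.g. via getTimeAsSeconds
    with a "1h" default), but the config key name here
    ("spark.yarn.token.renewal.interval") and the standalone parser are
    hypothetical, for illustration only:

```scala
import java.util.concurrent.TimeUnit

// Minimal self-contained sketch, assuming a SparkConf-like string map.
// The key "spark.yarn.token.renewal.interval" is a hypothetical name;
// SparkConf's getTimeAsSeconds would normally do the suffix parsing.
object RenewalInterval {
  // The current hardcoded value: schedule(this, 1, TimeUnit.HOURS)
  val DefaultSeconds: Long = TimeUnit.HOURS.toSeconds(1)

  // Parse values like "2h", "30m", "90s"; fall back to the 1-hour default.
  def seconds(conf: Map[String, String]): Long =
    conf.get("spark.yarn.token.renewal.interval") match {
      case Some(v) if v.endsWith("h") => TimeUnit.HOURS.toSeconds(v.dropRight(1).trim.toLong)
      case Some(v) if v.endsWith("m") => TimeUnit.MINUTES.toSeconds(v.dropRight(1).trim.toLong)
      case Some(v) if v.endsWith("s") => v.dropRight(1).trim.toLong
      case _                          => DefaultSeconds
    }
}
```

    The scheduling call would then become something like
    delegationTokenRenewer.schedule(this, RenewalInterval.seconds(conf), TimeUnit.SECONDS),
    keeping the existing one-hour behavior when the key is unset.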

