HeartSaVioR commented on a change in pull request #30366:
URL: https://github.com/apache/spark/pull/30366#discussion_r526810204
##########
File path:
core/src/main/scala/org/apache/spark/deploy/security/HadoopFSDelegationTokenProvider.scala
##########
@@ -126,13 +130,28 @@ private[deploy] class HadoopFSDelegationTokenProvider
     Try {
       val newExpiration = token.renew(hadoopConf)
       val identifier =
         token.decodeIdentifier().asInstanceOf[AbstractDelegationTokenIdentifier]
-      val interval = newExpiration - identifier.getIssueDate
-      logInfo(s"Renewal interval is $interval for token ${token.getKind.toString}")
+      val tokenKind = token.getKind.toString
+      val interval = newExpiration - getIssueDate(tokenKind, identifier)
+      logInfo(s"Renewal interval is $interval for token $tokenKind")
       interval
     }.toOption
   }
   if (renewIntervals.isEmpty) None else Some(renewIntervals.min)
 }
+
+  private def getIssueDate(kind: String, identifier: AbstractDelegationTokenIdentifier): Long = {
Review comment:
The reason I was negative on adding a test is that, since we know how this broke, we would basically expect the test to simulate the breakage and verify it no longer fails, e.g. by testing `obtainDelegationTokens()`. But `obtainDelegationTokens()` calls various private methods which in turn call Hadoop APIs, hence requiring mocks for everything. That is existing technical debt, and non-trivial to pay off compared to what I did for the fix.
I'm totally OK with adding a test verifying this method, as it only requires me to mock AbstractDelegationTokenIdentifier. I'm not sure that meets the bar for adding a test, though.
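To illustrate why the method-level test is cheap: only the identifier needs to be faked. The sketch below is illustrative, not the PR's actual code; the stub class, the copied fallback logic (falling back to the current time when the identifier reports a non-positive issue date), and the public visibility are all assumptions made so the example is self-contained.

```scala
// Hypothetical stand-in for Hadoop's AbstractDelegationTokenIdentifier,
// exposing only the accessor this sketch needs.
class StubIdentifier(issueDate: Long) {
  def getIssueDate: Long = issueDate
}

// Illustrative version of the fallback logic under test (assumption:
// a non-positive issue date falls back to the current time).
def getIssueDate(kind: String, identifier: StubIdentifier): Long = {
  val issueDate = identifier.getIssueDate
  if (issueDate > 0L) issueDate else System.currentTimeMillis()
}

// The method-level test needs no Hadoop mocks at all:
assert(getIssueDate("HDFS_DELEGATION_TOKEN", new StubIdentifier(1000L)) == 1000L)
// A bogus issue date of 0 no longer produces a huge renewal interval.
assert(getIssueDate("HDFS_DELEGATION_TOKEN", new StubIdentifier(0L)) > 0L)
```

In a real Spark test suite the stub would instead be a Mockito mock of AbstractDelegationTokenIdentifier, but the shape of the test is the same.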