HeartSaVioR commented on a change in pull request #27191: [SPARK-30495][SS]
Consider spark.security.credentials.kafka.enabled and cluster configuration
when checking latest delegation token
URL: https://github.com/apache/spark/pull/27191#discussion_r366310601
##########
File path: external/kafka-0-10-token-provider/src/main/scala/org/apache/spark/kafka010/KafkaTokenUtil.scala
##########
@@ -293,11 +293,11 @@ private[spark] object KafkaTokenUtil extends Logging {
   def isConnectorUsingCurrentToken(
       params: ju.Map[String, Object],
       clusterConfig: Option[KafkaTokenClusterConf]): Boolean = {
-    if (params.containsKey(SaslConfigs.SASL_JAAS_CONFIG)) {
+    if (SparkEnv.get.conf.getBoolean("spark.security.credentials.kafka.enabled", true) &&
Review comment:
OK, got it. It would be better to take sparkConf as a parameter, like the
other methods in this class do. That avoids the NPE caused by SparkEnv.get
returning null in test suites, and also makes it easier to inject Spark
configuration during tests.
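
For illustration, here is a minimal, self-contained sketch of that suggestion.
The object name and the simplified body are hypothetical (the real
KafkaTokenUtil method also compares the connector's JAAS config against the
cluster's current delegation token); the point is only the signature change:

```scala
import java.{util => ju}

import org.apache.kafka.common.config.SaslConfigs
import org.apache.spark.SparkConf

// Hypothetical sketch, not the actual KafkaTokenUtil: it illustrates passing
// SparkConf explicitly instead of reaching for SparkEnv.get.
object KafkaTokenUtilSketch {
  def isConnectorUsingCurrentToken(
      sparkConf: SparkConf,
      params: ju.Map[String, Object]): Boolean = {
    // With SparkConf as a parameter, no live SparkEnv is needed, so a test
    // suite can construct the config directly (no NPE from SparkEnv.get).
    val tokensEnabled =
      sparkConf.getBoolean("spark.security.credentials.kafka.enabled", true)
    // Simplified placeholder for the real token check: here we only verify
    // that a JAAS config has been set on the connector parameters.
    tokensEnabled && params.containsKey(SaslConfigs.SASL_JAAS_CONFIG)
  }
}
```

A test can then inject configuration without any SparkEnv setup, e.g. pass
`new SparkConf().set("spark.security.credentials.kafka.enabled", "false")`
straight into the method.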