gaborgsomogyi commented on a change in pull request #27191: [SPARK-30495][SS] 
Consider spark.security.credentials.kafka.enabled and cluster configuration 
when checking latest delegation token
URL: https://github.com/apache/spark/pull/27191#discussion_r366219240
 
 

 ##########
 File path: 
external/kafka-0-10-token-provider/src/main/scala/org/apache/spark/kafka010/KafkaTokenUtil.scala
 ##########
 @@ -293,11 +293,11 @@ private[spark] object KafkaTokenUtil extends Logging {
   def isConnectorUsingCurrentToken(
       params: ju.Map[String, Object],
       clusterConfig: Option[KafkaTokenClusterConf]): Boolean = {
-    if (params.containsKey(SaslConfigs.SASL_JAAS_CONFIG)) {
+    if (SparkEnv.get.conf.getBoolean("spark.security.credentials.kafka.enabled", true) &&

 Review comment:
   > Is there a case where "spark.security.credentials.kafka.enabled" is set to 
false but "clusterConfig.isDefined" is set to true?
   
   Yes. If one has configured the clusters, there must still be an easy way to turn the feature off. This is actually stated in the documentation.
   
   > I'm wondering whether we only check "clusterConfig.isDefined" and achieve 
the same thing.
   
   It's not the same, for the reason given in my previous comment.
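   
   To make the distinction concrete, here is a minimal standalone sketch of the intended guard semantics (illustrative only: `TokenGuardSketch` and `tokenCheckNeeded` are hypothetical names, and `Option[String]` stands in for `Option[KafkaTokenClusterConf]`; this is not the exact code in KafkaTokenUtil.scala):
   
```scala
// Illustrative sketch of the guard semantics discussed above; the names
// model the PR's config flag and clusterConfig parameter, but this is NOT
// the exact code in KafkaTokenUtil.scala.
object TokenGuardSketch {
  // `kafkaEnabled` stands in for spark.security.credentials.kafka.enabled;
  // `clusterConfig` stands in for the matched cluster configuration, if any.
  def tokenCheckNeeded(kafkaEnabled: Boolean, clusterConfig: Option[String]): Boolean =
    kafkaEnabled && clusterConfig.isDefined

  def main(args: Array[String]): Unit = {
    // Clusters are configured but the user switched the feature off:
    // checking clusterConfig.isDefined alone would wrongly say "yes" here.
    assert(!tokenCheckNeeded(kafkaEnabled = false, clusterConfig = Some("cluster1")))
    // Only the combination of the flag and a cluster config enables the check.
    assert(tokenCheckNeeded(kafkaEnabled = true, clusterConfig = Some("cluster1")))
    assert(!tokenCheckNeeded(kafkaEnabled = true, clusterConfig = None))
    println("guard semantics verified")
  }
}
```
   
   The first assertion is the case raised above: checking only `clusterConfig.isDefined` would drop it, which is why the flag check is needed as well.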
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
