HeartSaVioR commented on a change in pull request #25760: [SPARK-29054][SS] Invalidate Kafka consumer when new delegation token available URL: https://github.com/apache/spark/pull/25760#discussion_r329331616
File path: `external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/KafkaDataConsumer.scala`

```diff
@@ -46,6 +47,10 @@ private[kafka010] class InternalKafkaConsumer(
   val groupId = kafkaParams.get(ConsumerConfig.GROUP_ID_CONFIG).asInstanceOf[String]
+
+  // These fields must be updated whenever a new consumer is created.
+  private[kafka010] var clusterConfig: Option[KafkaTokenClusterConf] = _
```

Review comment: Do we assume that the cluster configurations in SparkConf change dynamically? If not, the matched cluster config will never change, so it could be retrieved once here (changing the field to a `val`).
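The reviewer's suggestion can be sketched as follows. This is a minimal, self-contained illustration, not the actual Spark implementation: `KafkaTokenClusterConf` here is a stand-in case class, and `findMatchingConfig` is a hypothetical lookup standing in for however Spark resolves the matching cluster config from its parameters. The point is only the design choice: if the config cannot change during the consumer's lifetime, resolve it once into an immutable `val` instead of a `var` that must be reassigned every time a new underlying consumer is created.

```scala
// Stand-in for org.apache.spark.kafka010.KafkaTokenClusterConf (assumption,
// simplified to two fields for illustration).
case class KafkaTokenClusterConf(identifier: String, bootstrapServers: String)

object ClusterConfigLookup {
  // Hypothetical lookup: in real Spark this would match the consumer's Kafka
  // params against the statically configured clusters in SparkConf.
  def findMatchingConfig(bootstrapServers: String): Option[KafkaTokenClusterConf] =
    Some(KafkaTokenClusterConf("cluster-1", bootstrapServers))
}

class InternalKafkaConsumerSketch(bootstrapServers: String) {
  // Resolved once at construction time, as the reviewer suggests, instead of
  // `private[kafka010] var clusterConfig: ... = _` reassigned on every
  // consumer (re)creation.
  val clusterConfig: Option[KafkaTokenClusterConf] =
    ClusterConfigLookup.findMatchingConfig(bootstrapServers)
}
```

If the configs can in fact change at runtime (e.g. when a new delegation token arrives, as this PR's title suggests), the mutable `var` would remain necessary, which is exactly what the review question is probing.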
