gaborgsomogyi commented on a change in pull request #23956: [SPARK-27042][SS] Close cached Kafka producer in case of task retry
URL: https://github.com/apache/spark/pull/23956#discussion_r262831004
##########
File path: external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/CachedKafkaProducer.scala
##########
@@ -79,6 +79,10 @@ private[kafka010] object CachedKafkaProducer extends Logging {
   * one instance per specified kafkaParams.
   */
  private[kafka010] def getOrCreate(kafkaParams: ju.Map[String, Object]): Producer = {
+    if (TaskContext.get != null && TaskContext.get.attemptNumber >= 1) {
+      logDebug(s"Reattempt detected, invalidating cached producer for params $kafkaParams")
+      close(kafkaParams)
Review comment:
> The newly created producer can be closed by an attempt of a different task at once.

I also think this is a good point. Some sort of `if (!inUse) close()` mechanism would be correct; see the sketch below.
@zsxwing Just for the sake of my deeper understanding: in which scenario can it happen that two tasks in the same executor write to the same topic-partition?
@ScrapCodes Are you proceeding with SPARK-21869? This PR needs the `inUse` flag that you've shown in #19096. Happy to help in any way.