I know this is related to Kafka, but it happens during a Spark Structured Streaming job, which is why I am asking on this mailing list.
How would you debug or work around this in Spark Structured Streaming? Any tips would be appreciated. Thanks.

java.lang.IllegalStateException: Cannot perform operation after producer has been closed
    at org.apache.kafka.clients.producer.KafkaProducer.throwIfProducerClosed(KafkaProducer.java:853)
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:862)
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:846)
    at org.apache.spark.sql.kafka010.KafkaRowWriter.sendRow(KafkaWriteTask.scala:92)
    at org.apache.spark.sql.kafka010.KafkaStreamDataWriter.write(KafkaStreamWriter.scala:95)
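For context on where this error can come from: the `kafka010` sink pools KafkaProducer instances in a cache, and the stack trace shows a write task calling `send` on a producer that has already been closed, which can happen if the cached producer is evicted and closed while a long-running task is still holding it. A hedged sketch of one mitigation, assuming that eviction is the cause: raise Spark's Kafka producer cache timeout (a documented Spark config, default 10 minutes) so producers stay cached longer than your longest task. This is a sketch of a config change, not a confirmed fix for this specific trace.

```shell
# Sketch: keep cached Kafka producers alive longer than the longest task.
# spark.kafka.producer.cache.timeout is a documented Spark config for the
# kafka010 connector; 30m here is an illustrative value, not a recommendation.
spark-submit \
  --conf spark.kafka.producer.cache.timeout=30m \
  --class com.example.MyStreamingJob \
  my-streaming-job.jar
```

The class and jar names above are placeholders for your own job. If the error persists, it is also worth checking executor logs for task retries or speculation, since a retried task can end up writing through a producer that a previous attempt's cleanup already closed.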