Which version are you using? There is a known issue regarding this, which
should be fixed in 2.3.1. See
https://issues.apache.org/jira/browse/SPARK-23623 for details.
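For context, the exception comes from KafkaConsumer's guard against concurrent use: it is deliberately single-threaded, and a second thread touching the same consumer fails fast rather than blocking. Here is a minimal Python sketch of that guard pattern (an illustration only, not Kafka's actual Java implementation; the class and message are made up for the example):

```python
import threading

class SingleThreadGuard:
    """Illustrative guard, similar in spirit to KafkaConsumer.acquire():
    only one thread may use the object at a time; a second thread gets an
    error immediately instead of waiting for a lock."""

    def __init__(self):
        self._owner = None            # ident of the thread currently inside
        self._lock = threading.Lock()

    def acquire(self):
        me = threading.get_ident()
        with self._lock:
            # Re-entry by the owning thread is fine; any other thread fails.
            if self._owner is not None and self._owner != me:
                raise RuntimeError(
                    "consumer is not safe for multi-threaded access")
            self._owner = me

    def release(self):
        with self._lock:
            self._owner = None

guard = SingleThreadGuard()
guard.acquire()                       # main thread takes ownership

errors = []
def worker():
    try:
        guard.acquire()               # second thread -> error, analogous to
    except RuntimeError as e:         # the ConcurrentModificationException
        errors.append(str(e))         # in the log below

t = threading.Thread(target=worker)
t.start()
t.join()
print(errors[0])
```

In the Spark case the user doesn't touch the consumer directly; the cached consumer was being closed from a different thread than the one reading it, which is what SPARK-23623 addresses.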

Best Regards,
Ryan

On Mon, Jul 2, 2018 at 3:56 AM, kant kodali <kanth...@gmail.com> wrote:

> Hi All,
>
> I get the below error quite often when I do a stream-stream inner join on
> two DataFrames. After running several experiments, stream-stream
> joins don't look stable enough for production yet. Any advice on this?
>
> Thanks!
>
> java.util.ConcurrentModificationException: KafkaConsumer is not safe for
> multi-threaded access
> 18/07/02 09:32:14 INFO LineBufferedStream: stdout:     at org.apache.kafka.clients.consumer.KafkaConsumer.acquire(KafkaConsumer.java:1431)
> 18/07/02 09:32:14 INFO LineBufferedStream: stdout:     at org.apache.kafka.clients.consumer.KafkaConsumer.close(KafkaConsumer.java:1361)
> 18/07/02 09:32:14 INFO LineBufferedStream: stdout:     at org.apache.spark.sql.kafka010.CachedKafkaConsumer.close(CachedKafkaConsumer.scala:301)
>
