I am using Spark 2.4 and KafkaUtils.createDirectStream to read from a Kafka
topic. The topic has messages written by a transactional producer.
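
For reference, this is roughly how I create the stream. It is a trimmed
sketch rather than my exact job: the app name, broker address, and batch
interval below are placeholders.

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    val conf = new SparkConf().setAppName("FtsConsumer") // placeholder name
    val ssc  = new StreamingContext(conf, Seconds(10))   // placeholder interval

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "broker1:9092",            // placeholder broker
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "FtsTopicConsumerGrp7",
      "auto.offset.reset"  -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Array("test11"), kafkaParams)
    )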

I am getting the following error:
"requirement failed: Got wrong record for
spark-executor-FtsTopicConsumerGrp7 test11-1 even after seeking to offset 85
got offset 86 instead. If this is a compacted topic, consider enabling
spark.streaming.kafka.allowNonConsecutiveOffsets"


When I enable spark.streaming.kafka.allowNonConsecutiveOffsets (the way I
set it is sketched after the stack trace), I get the following error instead:
java.lang.IllegalArgumentException: requirement failed: Failed to get
records for compacted spark-executor-FtsTopicConsumerGrpTESTING_5
fts.analytics-0 after polling for 10000
            at scala.Predef$.require(Predef.scala:224)
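
This is how I turn the flag on (again a sketch with a placeholder app name;
the poll timeout line is my guess at where the "polling for 10000" in the
error comes from):

    val conf = new SparkConf()
      .setAppName("FtsConsumer") // placeholder
      // allow gaps in offsets (compacted topics / transaction markers)
      .set("spark.streaming.kafka.allowNonConsecutiveOffsets", "true")
      // executor-side consumer poll timeout; seems to match the 10000 ms
      // mentioned in the error above
      .set("spark.streaming.kafka.consumer.poll.ms", "10000")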

I have also set isolation.level="read_committed" on the consumer.
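
Concretely, I pass it in the consumer params (sketch, building on the map
above):

    val kafkaParamsRC = kafkaParams +
      ("isolation.level" -> "read_committed") // skip records from aborted transactions

    // and then: Subscribe[String, String](Array("test11"), kafkaParamsRC)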

Any help on this will be appreciated.
