Hi,
I was working with the non-reliable, receiver-based version of Spark-Kafka
streaming, i.e. KafkaUtils.createStream(...). For testing purposes I was
consuming data from Kafka at a constant rate, and the job behaved as expected.
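For reference, here is a minimal sketch of the receiver-based setup I am describing. The ZooKeeper quorum, consumer group, topic name, and batch interval are placeholders for illustration, not the exact values from my job:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object KafkaReceiverExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("KafkaReceiverExample")
        // Batch interval is a placeholder value.
        val ssc = new StreamingContext(conf, Seconds(5))

        // Hypothetical ZooKeeper quorum, consumer group, and topic for illustration.
        val zkQuorum = "localhost:2181"
        val groupId  = "test-consumer-group"
        val topics   = Map("test-topic" -> 1) // topic -> number of receiver threads

        // Receiver-based (non-reliable) stream: a long-running receiver pulls
        // messages from Kafka and stores them in executor memory until each
        // batch is processed.
        val stream = KafkaUtils.createStream(ssc, zkQuorum, groupId, topics)

        // Simple action per batch on the message values.
        stream.map(_._2).count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }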
But when the volume of data in Kafka grew exponentially, my program started
crashing with
"Cannot Compute split on input data...". I also noticed in the console logs
that data kept accumulating in memory while it was being received from Kafka.

How does Spark Streaming behave when the incoming data rate grows exponentially like this?
