Hi All,
I am consuming an 8-partition Kafka topic through multiple DStreams and
processing them in Spark. But despite creating multiple input DStreams,
the Spark master UI shows only one receiver.
The following is the consumer part of the Spark code:
int numStreams = 8;
List<JavaPairDStream<String, String>> kafkaSt
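The snippet above is cut off, but it looks like the multi-receiver pattern from the Spark Streaming programming guide. A hedged sketch of that pattern is below; the names `jssc`, `zkQuorum`, `groupId`, and `topicMap` are assumptions, not taken from the original post. The key point is that each `KafkaUtils.createStream` call allocates its own receiver, so eight calls should show eight receivers in the UI, provided the application has enough executor cores (at least one core per receiver plus cores for processing).

```java
// Hypothetical sketch, assuming the receiver-based Kafka API (Spark 1.x).
// jssc, zkQuorum, groupId, and topicMap are placeholders for values
// that would come from the original application.
int numStreams = 8;
List<JavaPairDStream<String, String>> kafkaStreams = new ArrayList<>(numStreams);
for (int i = 0; i < numStreams; i++) {
    // Each createStream call allocates a separate receiver on an executor.
    kafkaStreams.add(KafkaUtils.createStream(jssc, zkQuorum, groupId, topicMap));
}
// Union the per-receiver streams so downstream processing sees one DStream.
JavaPairDStream<String, String> unified =
    jssc.union(kafkaStreams.get(0), kafkaStreams.subList(1, kafkaStreams.size()));
```

If only one receiver appears despite a loop like this, it is worth checking that the loop actually runs `numStreams` times and that the job was not submitted with too few cores (e.g. `local[1]`), since receivers that cannot get a core will not start.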
Hi All,
I am working on a Spark Streaming job. The job is supposed to read
streaming data from Kafka, but after submitting the job it fails with
org.apache.spark.SparkException: Job aborted due to stage failure: All
masters are unresponsive! and the following lines are printed
repeatedly.