Only one Kafka receiver is running in Spark irrespective of multiple DStreams

2015-01-01 Thread Tapas Swain
Hi All, I am consuming an 8-partition Kafka topic through multiple DStreams and processing them in Spark. But despite the multiple InputDStreams, the Spark master UI is showing only one receiver. The following is the consumer part of the Spark code: int numStreams = 8; List> kafkaSt
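
For context, a minimal sketch of the multi-receiver pattern the truncated snippet appears to follow, using the Spark 1.x receiver-based KafkaUtils.createStream API; the topic name, ZooKeeper quorum, group id, and batch interval below are placeholders, not values from the original post:

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaPairDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka.KafkaUtils;

    public class MultiReceiverKafkaConsumer {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("MultiReceiverKafkaConsumer");
            JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(2000));

            // One consumer thread per stream; "my-topic" and the ZooKeeper quorum are placeholders.
            Map<String, Integer> topicMap = new HashMap<String, Integer>();
            topicMap.put("my-topic", 1);

            // Create one receiver-based input DStream per Kafka partition.
            int numStreams = 8;
            List<JavaPairDStream<String, String>> kafkaStreams =
                    new ArrayList<JavaPairDStream<String, String>>(numStreams);
            for (int i = 0; i < numStreams; i++) {
                kafkaStreams.add(KafkaUtils.createStream(
                        jssc, "zk-host:2181", "my-consumer-group", topicMap));
            }

            // Union the streams so all receivers feed a single DStream for processing.
            JavaPairDStream<String, String> unified =
                    jssc.union(kafkaStreams.get(0), kafkaStreams.subList(1, kafkaStreams.size()));
            unified.print();

            jssc.start();
            jssc.awaitTermination();
        }
    }

Note that each receiver occupies one executor core, so the application must be allocated more cores than the number of receivers, otherwise some receivers will not run and the UI will show fewer receivers than DStreams created.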

Error while submitting Spark Streaming job in YARN

2014-11-20 Thread Tapas Swain
Hi All, I am working on a Spark Streaming job. The job is supposed to read streaming data from Kafka, but after submitting the job it shows org.apache.spark.SparkException: Job aborted due to stage failure: All masters are unresponsive! and the following lines are printed endlessly.
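
For context, a minimal sketch of how such a job is typically set up for YARN; the "All masters are unresponsive" message comes from the standalone-mode deploy client, so it often indicates the application is connecting to a spark:// master URL rather than to YARN. The class name and batch interval are placeholders, not from the original post:

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class KafkaOnYarnJob {
        public static void main(String[] args) throws Exception {
            // Avoid hard-coding setMaster("spark://..."); let spark-submit supply the master,
            // e.g. spark-submit --master yarn-cluster ... (Spark 1.x syntax).
            SparkConf conf = new SparkConf().setAppName("KafkaOnYarnJob");
            JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(2000));

            // Kafka input and processing would be defined here, e.g. via KafkaUtils.createStream.

            jssc.start();
            jssc.awaitTermination();
        }
    }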