1. Of course, a single block/partition can contain many Kafka messages, possibly from different Kafka topics interleaved together. The message count is not tied to the block count: any message received within a particular block interval goes into the same block.
2. Yes, the receiver will be restarted on another worker.

TD

On Tue, Dec 30, 2014 at 2:19 PM, SamyaMaiti <samya.maiti2...@gmail.com> wrote:
> Hi Experts,
>
> A few general queries:
>
> 1. Can a single block/partition in an RDD have more than one Kafka message,
> or will there be one and only one Kafka message per block? More broadly, is
> the message count related to the block in any way, or is it just that any
> message received within a particular block interval will go in the same
> block?
>
> 2. If a worker that runs the Receiver for Kafka goes down, will the
> receiver be restarted on some other worker?
>
> Regards,
> Sam
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Kafka-Spark-streaming-tp20914.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> ---------------------------------------------------------------------
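To make point 1 concrete, here is a minimal sketch (plain Python, not actual Spark code) of how a receiver groups arriving messages into blocks purely by arrival time. It assumes Spark Streaming's default block interval of 200 ms (the `spark.streaming.blockInterval` setting); the message tuples, timestamps, and `assign_blocks` helper are illustrative inventions, not Spark APIs.

```python
# Simplified model of receiver-side block formation in Spark Streaming:
# every message that arrives within the same block interval lands in the
# same block, regardless of which Kafka topic it came from.

BLOCK_INTERVAL_MS = 200  # default spark.streaming.blockInterval (200 ms)

def assign_blocks(messages):
    """Group (arrival_ms, topic, payload) tuples into blocks by interval."""
    blocks = {}
    for arrival_ms, topic, payload in messages:
        block_id = arrival_ms // BLOCK_INTERVAL_MS  # which interval it fell in
        blocks.setdefault(block_id, []).append((topic, payload))
    return blocks

# Hypothetical messages from two topics, interleaved in arrival order.
messages = [
    (10,  "topicA", "m1"),
    (50,  "topicB", "m2"),
    (190, "topicA", "m3"),
    (210, "topicB", "m4"),
]

blocks = assign_blocks(messages)
print(len(blocks[0]))  # 3 -- first 200 ms window holds three messages
print(len(blocks[1]))  # 1 -- the fourth arrived in the next window
```

The point of the sketch: block membership is driven by the clock, not by message count or topic, which is why one block can hold many messages from several topics.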