Hello,
@Akhil Das I'm trying to use the experimental API
https://github.com/apache/spark/blob/master/examples/scala-2.10/src/main/scala/org/apache/spark/examples/streaming/DirectKafkaWordCount.scala
Can you show us the output of DStream#print(), if you have it?
Thanks
On Tue, Mar 31, 2015 at 2:55 AM, Nicolas Phung nicolas.ph...@gmail.com
wrote:
Nicolas:
See if there was occurrence of the following exception in the log:
errs => throw new SparkException(
  s"Couldn't connect to leader for topic ${part.topic} ${part.partition}: " +
    errs.mkString("\n")),
Cheers
On Mon, Mar 30, 2015 at 9:40 AM, Cody Koeninger
Did you try this example?
https://github.com/apache/spark/blob/master/examples/scala-2.10/src/main/scala/org/apache/spark/examples/streaming/KafkaWordCount.scala
I think you need to create a topic set with the number of partitions to consume.
Thanks
Best Regards
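(For reference, the "topic set" here is just a Set[String] of topic names passed to the direct API, along with a kafkaParams map pointing at the brokers. A minimal sketch, with illustrative broker and topic values:)

```scala
// Illustrative values only; substitute your own brokers and topics.
val topics = "wordcount-topic"
val topicsSet = topics.split(",").toSet
val kafkaParams = Map[String, String](
  "metadata.broker.list" -> "localhost:9092")
```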
On Mon, Mar 30, 2015 at 9:35 PM, Nicolas
This line
at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.close(
KafkaRDD.scala:158)
is the attempt to close the underlying Kafka simple consumer.
We can add a null pointer check, but the underlying issue of the consumer
being null probably indicates a problem earlier. Do you see
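(A sketch of the null check Cody mentions, purely hypothetical; the real fix would live in KafkaRDD's iterator, where the consumer is created lazily and may still be null when close() runs:)

```scala
// Hypothetical guard: only close the consumer if it was actually created.
private var consumer: SimpleConsumer = null

override def close(): Unit = {
  if (consumer != null) {
    consumer.close()
  }
}
```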
Hello,
I'm using spark-streaming-kafka 1.3.0 with the new consumer, Approach 2:
Direct Approach (No Receivers) (
http://spark.apache.org/docs/latest/streaming-kafka-integration.html). I'm
using the following code snippet:
// Create direct kafka stream with brokers and topics
val messages =
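(The snippet is cut off here; based on the DirectKafkaWordCount example linked above, it presumably continues along these lines. A sketch for Spark 1.3.0, assuming `ssc`, `kafkaParams`, and `topicsSet` are defined as in that example:)

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

// Create direct kafka stream with brokers and topics (Spark 1.3 direct API).
val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, topicsSet)
```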