I have seen questions posted about this on SO and on this list, but I haven't
seen a response that addresses my issue. I am trying to create a direct
stream connection to a Kafka topic, but it fails with "Couldn't find leader
offsets for Set(...)". If I run a Kafka console consumer I can read the topic,
but I can't do the same with Spark. Can someone tell me where I'm going wrong?
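
The relevant part of my driver code looks roughly like this (trimmed down
for the post; the topic and kafkaParams match the logs below, the rest is
boilerplate):

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object Experiment {
  def main(args: Array[String]): Unit = {
    val brokers = args(0)  // e.g. 10.0.7.34:9092
    val conf = new SparkConf().setAppName("Experiment")
    val ssc = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map(
      "metadata.broker.list" -> brokers,
      "auto.offset.reset" -> "smallest")
    val topics = Set("footopic")

    // This is the call that fails with "Couldn't find leader offsets"
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    stream.map(_._2).print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```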

Test topic info:
vagrant@broker1$ ./bin/kafka-topics.sh --describe --zookeeper 10.30.3.2:2181
--topic footopic
Topic:footopic  PartitionCount:1        ReplicationFactor:1     Configs:
        Topic: footopic Partition: 0    Leader: 0       Replicas: 0       Isr: 0

Consuming from Kafka:
vagrant@broker1$ bin/kafka-console-consumer.sh --zookeeper 10.30.3.2:2181
--from-beginning --topic footopic
this is a test
and so is this
goodbye

Attempting from Spark:
spark-submit --class com.foo.Experiment --master local[*] --jars
/vagrant/spark-streaming-kafka-assembly_2.10-1.6.1.jar
/vagrant/spark-app-1.0-SNAPSHOT.jar 10.0.7.34:9092

...

Using kafkaparams: {auto.offset.reset=smallest,
metadata.broker.list=10.0.7.34:9092}
16/05/18 20:27:21 INFO utils.VerifiableProperties: Verifying properties
16/05/18 20:27:21 INFO utils.VerifiableProperties: Property
auto.offset.reset is overridden to smallest
16/05/18 20:27:21 INFO utils.VerifiableProperties: Property group.id is
overridden to 
16/05/18 20:27:21 INFO utils.VerifiableProperties: Property
zookeeper.connect is overridden to 
16/05/18 20:27:21 INFO consumer.SimpleConsumer: Reconnect due to socket
error: java.nio.channels.ClosedChannelException
Exception in thread "main" org.apache.spark.SparkException:
java.nio.channels.ClosedChannelException
org.apache.spark.SparkException: Couldn't find leader offsets for
Set([footopic,0])
...
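
From reading around, a ClosedChannelException during the leader-offset
lookup often means the broker is advertising a host/port that isn't
reachable from the machine running the driver. In Kafka 0.8 that is
controlled by these settings in the broker's server.properties (the values
here are illustrative, not my actual config):

```
# server.properties on the broker -- the address returned in metadata
# responses must be routable from the Spark driver, not just from ZooKeeper
advertised.host.name=10.0.7.34
advertised.port=9092
```

I'm not sure whether that's actually my problem, but it seemed worth
mentioning since the console consumer goes through ZooKeeper while the
direct stream talks to the broker address directly.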


Any help is appreciated.

Thanks,
ch.





--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Couldn-t-find-leader-offsets-tp26978.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
