Add --jars <LOCATION>/spark-streaming-kafka_2.10-1.5.1.jar to spark-shell (you may need to download the jar file, or any newer version). I also have spark-streaming-kafka-assembly_2.10-1.6.1.jar on the --jars list.

HTH

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

*Disclaimer:* Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.

On 13 October 2016 at 09:24, JayKay <juliankeppel1...@gmail.com> wrote:

> I want to work with the Kafka integration for Structured Streaming. I use
> Spark version 2.0.0, and I start the spark-shell with:
>
> spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.0.0
>
> As described here:
> https://github.com/apache/spark/blob/master/docs/structured-streaming-kafka-integration.md
>
> But I get an unresolved dependency error ("unresolved dependency:
> org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found"). So it seems
> not to be available via Maven or spark-packages.
>
> How can I access this package? Or am I doing something wrong/missing?
>
> Thank you for your help.
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Want-to-test-spark-sql-kafka-but-get-unresolved-dependency-error-tp27891.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
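As a sketch, the --jars workaround described above would be invoked like this. The jar file names come from the email; the /path/to locations are illustrative placeholders for wherever you saved the downloaded jars:

```shell
# Launch spark-shell with the Kafka streaming jars added to the classpath.
# Jar paths are placeholders; point them at the jars you actually downloaded.
# Multiple jars are passed to --jars as a single comma-separated list.
spark-shell \
  --jars /path/to/spark-streaming-kafka_2.10-1.5.1.jar,/path/to/spark-streaming-kafka-assembly_2.10-1.6.1.jar
```

Note that --jars adds local jar files to the driver and executor classpaths, whereas the --packages flag used in the quoted question resolves Maven coordinates from a repository; the unresolved-dependency error in the question means the requested coordinate was not found in the repositories consulted, which the local --jars approach sidesteps.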