Or you could build an uber jar (you could google that):

https://eradiating.wordpress.com/2015/02/15/getting-spark-streaming-on-kafka-to-work/
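The idea of an uber (assembly) jar is to bundle the application and all non-provided dependencies into one file, so nothing is missing on the cluster. A minimal sketch with the sbt-assembly plugin (plugin and library versions here are illustrative, not taken from this thread):

```scala
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt
name := "kafka-streaming-app"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  // "provided": the cluster supplies Spark itself at runtime, so keep it out of the jar
  "org.apache.spark" %% "spark-streaming" % "1.2.1" % "provided",
  // the Kafka integration is not part of the Spark distribution, so bundle it
  "org.apache.spark" %% "spark-streaming-kafka" % "1.2.1"
)
```

Running `sbt assembly` then produces a single jar that can be handed to spark-submit with no extra --jars flags.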

--- Original Message ---

From: "Akhil Das" <ak...@sigmoidanalytics.com>
Sent: April 4, 2015 11:52 PM
To: "Priya Ch" <learnings.chitt...@gmail.com>
Cc: user@spark.apache.org, "dev" <d...@spark.apache.org>
Subject: Re: Spark streaming with Kafka- couldnt find KafkaUtils

How are you submitting the application? Use a standard build tool like
Maven or sbt to build your project; it will download all the dependency
jars. If you are submitting with spark-submit, use the --jars option to
add the jars that are causing the ClassNotFoundException. If you are
running as a standalone application without spark-submit, then while
creating the SparkContext, use sc.addJar() to add those dependency jars.
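For the spark-submit route, the invocation might look like this (the class name, master URL, and jar paths are hypothetical placeholders, not values from this thread):

```shell
# --jars ships the listed jars to the driver and executors alongside
# the application jar, so KafkaUtils is on the classpath at runtime
spark-submit \
  --class com.example.MyStreamingApp \
  --master spark://master-host:7077 \
  --jars spark-streaming-kafka_2.10-1.2.1.jar,kafka_2.10-0.8.0.jar,metrics-core-2.2.0.jar,zkclient-0.3.jar \
  my-streaming-app.jar
```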

For Kafka streaming, when you build with sbt, these are the jars that are
required:

sc.addJar("/root/.ivy2/cache/org.apache.spark/spark-streaming-kafka_2.10/jars/spark-streaming-kafka_2.10-1.1.0.jar")
sc.addJar("/root/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar")
sc.addJar("/root/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.0.jar")
sc.addJar("/root/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar")
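Rather than hard-coding paths into the local ivy cache, a more portable option is to declare the dependency in build.sbt and let sbt resolve metrics-core, kafka, and zkclient transitively (the version matches the jar above and is only illustrative):

```scala
// build.sbt: sbt pulls kafka_2.10, metrics-core, and zkclient in as
// transitive dependencies of spark-streaming-kafka
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
```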




Thanks
Best Regards

On Sun, Apr 5, 2015 at 12:00 PM, Priya Ch <learnings.chitt...@gmail.com>
wrote:

> Hi All,
>
>   I configured a Kafka cluster on a single node, and I have a streaming
> application which reads data from a Kafka topic using KafkaUtils. When I
> execute the code in local mode from the IDE, the application runs fine.
>
> But when I submit the same to spark cluster in standalone mode, I end up
> with the following exception:
> java.lang.ClassNotFoundException:
> org/apache/spark/streaming/kafka/KafkaUtils.
>
> I am using Spark 1.2.1. When I checked the streaming source files, the
> files related to Kafka are missing. Are these not included in the
> spark-1.3.0 and spark-1.2.1 versions?
>
> Do I have to include these manually?
>
> Regards,
> Padma Ch
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
