What kind of cluster mode are you running on? You may need to specify the
jar through --jars, though we're working on having spark-submit
automatically add the application jar to the class path so you don't run
into the ClassNotFoundException you're seeing.

What is the command that you ran?
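For reference, a minimal sketch of the kind of invocation in question. The class name is taken from the error below; the master URL, the extra jar paths, and the application jar path are all placeholders:

```shell
# Hypothetical spark-submit invocation -- the master URL and every jar
# path here are placeholders; adjust them for your build and cluster.
spark-submit \
  --class org.apache.spark.streaming.examples.StreamingExamples \
  --master spark://master-host:7077 \
  --jars external/kafka-assembly.jar,external/mqtt-assembly.jar \
  path/to/your-streaming-app.jar
```

Jars listed under --jars are shipped to the executors and added to their class paths, which is usually what resolves worker-side ClassNotFoundExceptions like the one quoted below.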


On Tue, May 6, 2014 at 6:27 PM, Tathagata Das
<tathagata.das1...@gmail.com>wrote:

> Doesn't the run-example script work for you? Also, are you on the latest
> commit of branch-1.0?
>
> TD
>
>
> On Mon, May 5, 2014 at 7:51 PM, Soumya Simanta 
> <soumya.sima...@gmail.com>wrote:
>
>>
>>
>> Yes, I'm struggling with a similar problem where my classes are not found
>> on the worker nodes. I'm using 1.0.0-SNAPSHOT. I would really appreciate
>> it if someone could provide some documentation on the usage of spark-submit.
>>
>> Thanks
>>
>> > On May 5, 2014, at 10:24 PM, Stephen Boesch <java...@gmail.com> wrote:
>> >
>> >
>> > I have a Spark Streaming application that uses the external streaming
>> modules (e.g. Kafka, MQTT, ...) as well. It is not clear how to properly
>> invoke the spark-submit script: are the --driver-class-path and/or
>> -Dspark.executor.extraClassPath parameters required?
>> >
>> >  For reference, the following error is proving difficult to resolve:
>> >
>> > java.lang.ClassNotFoundException:
>> org.apache.spark.streaming.examples.StreamingExamples
>> >
>>
>
>
