So you have Kafka in the classpath of your Java application, where you are
creating the SparkContext with the Spark standalone master URL, right?

The recommended way of submitting Spark applications to any cluster is
using spark-submit. See
https://spark.apache.org/docs/latest/submitting-applications.html. This
takes care of sending all the libraries to the cluster workers so that they
can be found.
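
For example, something along these lines (the class name, master URL, and
jar paths below are placeholders for your application and its Kafka/encoder
dependencies):

    ./bin/spark-submit \
      --class com.abc.YourStreamingApp \
      --master spark://<master-host>:7077 \
      --jars /path/to/kafka_2.10-0.8.2.1.jar,/path/to/your-encoders.jar \
      /path/to/your-streaming-app.jar

The jars listed with --jars (and the application jar itself) are shipped to
the executors and added to their classpaths, so classes like your custom
ObjectEncoder can be found at runtime.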

Please try that.

On Mon, Jun 22, 2015 at 11:50 PM, Murthy Chelankuri <kmurt...@gmail.com>
wrote:

> I am invoking it from the Java application by creating the SparkContext.
>
> On Tue, Jun 23, 2015 at 12:17 PM, Tathagata Das <t...@databricks.com>
> wrote:
>
>> How are you adding that to the classpath? Through spark-submit or
>> otherwise?
>>
>> On Mon, Jun 22, 2015 at 5:02 PM, Murthy Chelankuri <kmurt...@gmail.com>
>> wrote:
>>
>>> Yes, I have the producer in the classpath. And I am using it in standalone
>>> mode.
>>>
>>> Sent from my iPhone
>>>
>>> On 23-Jun-2015, at 3:31 am, Tathagata Das <t...@databricks.com> wrote:
>>>
>>> Do you have the Kafka producer in your classpath? If so, how are you adding
>>> that library? Are you running on YARN, Mesos, Standalone, or local? These
>>> details will be very useful.
>>>
>>> On Mon, Jun 22, 2015 at 8:34 AM, Murthy Chelankuri <kmurt...@gmail.com>
>>> wrote:
>>>
>>>> I am using Spark Streaming. What I am trying to do is send a few
>>>> messages to a Kafka topic, and that is where it is failing.
>>>>
>>>> java.lang.ClassNotFoundException: com.abc.mq.msg.ObjectEncoder
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>     at java.lang.Class.forName0(Native Method)
>>>>     at java.lang.Class.forName(Class.java:264)
>>>>     at kafka.utils.Utils$.createObject(Utils.scala:438)
>>>>     at kafka.producer.Producer.<init>(Producer.scala:61)
>>>>
>>>> On Mon, Jun 22, 2015 at 8:24 PM, Murthy Chelankuri <kmurt...@gmail.com>
>>>> wrote:
>>>>
>>>>> I have been using Spark for the last 6 months, with version
>>>>> 1.2.0.
>>>>>
>>>>> I am trying to migrate to 1.3.0, but the same program I have
>>>>> written is not working.
>>>>>
>>>>> It is giving a ClassNotFoundException when I try to load some dependent
>>>>> jars from the main program.
>>>>>
>>>>> This used to work in 1.2.0, when I set the array of dependent jars on the
>>>>> SparkContext, but it is not working in 1.3.0.
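>>>>>
>>>>> Roughly, what I mean is something like this (the jar paths and master
>>>>> URL below are placeholders):
>>>>>
>>>>>     import org.apache.spark.SparkConf;
>>>>>     import org.apache.spark.api.java.JavaSparkContext;
>>>>>
>>>>>     SparkConf conf = new SparkConf()
>>>>>         .setMaster("spark://<master-host>:7077")
>>>>>         .setAppName("KafkaStreamingApp")
>>>>>         // array of dependent jars that 1.2.0 used to ship to the workers
>>>>>         .setJars(new String[] {
>>>>>             "/path/to/kafka_2.10.jar",
>>>>>             "/path/to/encoders.jar"
>>>>>         });
>>>>>     JavaSparkContext sc = new JavaSparkContext(conf);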
>>>>>
>>>>>
>>>>> Please help me resolve this.
>>>>>
>>>>>
>>>>> Thanks,
>>>>> Murthy Chelankuri
>>>>>
>>>>
>>>>
>>>
>>
>
