This could be because of some subtle change in the classloaders used by the
executors. I think there have been issues in the past with libraries that
use Class.forName to find classes by reflection. Because the executors load
classes dynamically through custom classloaders, a library that calls
Class.forName resolves against the wrong classloader, one that does not see
the custom classloader holding the dynamically loaded classes.
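
Here is a rough sketch of what I mean, purely illustrative (it only reuses the
encoder class name from your stack trace below):

    public class LoaderCheck {
        public static void main(String[] args) throws Exception {
            try {
                // The one-argument Class.forName resolves against the classloader
                // that defined the calling class; on an executor that may be the
                // system classloader, which never saw the jars added at runtime.
                Class.forName("com.abc.mq.msg.ObjectEncoder");
            } catch (ClassNotFoundException e) {
                // The thread context classloader is the one Spark points at its
                // custom loader holding the dynamically loaded classes, so an
                // explicit lookup through it can succeed where the call above fails.
                ClassLoader ctx = Thread.currentThread().getContextClassLoader();
                Class.forName("com.abc.mq.msg.ObjectEncoder", true, ctx);
            }
        }
    }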

One workaround is to add the relevant library to the Spark conf
spark.executor.extraClassPath and see if it works. Make sure the library is
already present on the worker machines at the given path. This should start
the executors with the library on the initial classpath, and therefore
visible to the system classloader, so Class.forName would probably find it.
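
For example, since you are creating the SparkContext yourself rather than
going through spark-submit, something along these lines should do it (the jar
path, app name, and master URL are placeholders):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ProducerApp {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                .setAppName("kafka-producer-app")        // placeholder app name
                .setMaster("spark://master-host:7077")   // placeholder standalone master URL
                // Prepended to the executor JVM's classpath at launch, so the classes
                // are visible to the system classloader before any task runs. The jar
                // must already exist at this path on every worker machine.
                .set("spark.executor.extraClassPath", "/opt/libs/kafka-encoders.jar");
            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... build the streaming job / Kafka producer as before ...
            sc.stop();
        }
    }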

TD

On Tue, Jun 23, 2015 at 12:14 AM, Murthy Chelankuri <kmurt...@gmail.com>
wrote:

> Yes, in Spark standalone mode with the master URL.
>
> The jars are copied to the executors and the application runs fine, but it
> fails at the point where Kafka tries to load the Encoder and Partitioner
> classes using reflection.
>
> Here are my findings so far on this issue.
>
> In the driver app, if any module tries to load a class through the class
> loader (using reflection), it is not able to find the class. This used to
> work in 1.2.0; I'm not sure why it is not working with 1.3.0.
>
> Is there any way we can make the driver use the Spark executor's
> classloader for loading the classes, or something like that?
>
>
> On Tue, Jun 23, 2015 at 12:28 PM, Tathagata Das <t...@databricks.com>
> wrote:
>
>> So you have Kafka in your classpath in your Java application, where you
>> are creating the SparkContext with the Spark standalone master URL, right?
>>
>> The recommended way of submitting Spark applications to any cluster is
>> using spark-submit. See
>> https://spark.apache.org/docs/latest/submitting-applications.html. This
>> takes care of sending all the libraries to the cluster workers so that
>> they can be found.
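>>
>> For example (the application class, jar names, and master URL below are
>> placeholders):
>>
>>     ./bin/spark-submit \
>>       --master spark://master-host:7077 \
>>       --class com.abc.MyStreamingApp \
>>       --jars kafka-clients.jar,my-encoders.jar \
>>       my-app.jar
>>
>> The jars listed with --jars are shipped to the workers and put on the
>> executor classpath automatically.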
>>
>> Please try that.
>>
>> On Mon, Jun 22, 2015 at 11:50 PM, Murthy Chelankuri <kmurt...@gmail.com>
>> wrote:
>>
>>> I am invoking it from the Java application by creating the SparkContext
>>>
>>> On Tue, Jun 23, 2015 at 12:17 PM, Tathagata Das <t...@databricks.com>
>>> wrote:
>>>
>>>> How are you adding that to the classpath? Through spark-submit or
>>>> otherwise?
>>>>
>>>> On Mon, Jun 22, 2015 at 5:02 PM, Murthy Chelankuri <kmurt...@gmail.com>
>>>> wrote:
>>>>
>>>>> Yes, I have the producer in the classpath. And I am using it in
>>>>> standalone mode.
>>>>>
>>>>> Sent from my iPhone
>>>>>
>>>>> On 23-Jun-2015, at 3:31 am, Tathagata Das <t...@databricks.com> wrote:
>>>>>
>>>>> Do you have the Kafka producer in your classpath? If so, how are you
>>>>> adding that library? Are you running on YARN, Mesos, Standalone, or
>>>>> local? These details will be very useful.
>>>>>
>>>>> On Mon, Jun 22, 2015 at 8:34 AM, Murthy Chelankuri <kmurt...@gmail.com
>>>>> > wrote:
>>>>>
>>>>>> I am using Spark Streaming. What I am trying to do is send a few
>>>>>> messages to a Kafka topic, and that is where it is failing.
>>>>>>
>>>>>> java.lang.ClassNotFoundException: com.abc.mq.msg.ObjectEncoder
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>>>     at java.lang.Class.forName0(Native Method)
>>>>>>     at java.lang.Class.forName(Class.java:264)
>>>>>>     at kafka.utils.Utils$.createObject(Utils.scala:438)
>>>>>>     at kafka.producer.Producer.<init>(Producer.scala:61)
>>>>>>
>>>>>> On Mon, Jun 22, 2015 at 8:24 PM, Murthy Chelankuri <
>>>>>> kmurt...@gmail.com> wrote:
>>>>>>
>>>>>>> I have been using Spark for the last 6 months with version 1.2.0.
>>>>>>>
>>>>>>> I am trying to migrate to 1.3.0, but the same program I have written
>>>>>>> is not working.
>>>>>>>
>>>>>>> It gives a class-not-found error when I try to load some dependent
>>>>>>> jars from the main program.
>>>>>>>
>>>>>>> This used to work in 1.2.0 when I set the array of all the dependent
>>>>>>> jars on the Spark context, but it is not working in 1.3.0.
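>>>>>>>
>>>>>>> (What I mean by setting the dependent jars on the Spark context is
>>>>>>> roughly this; the jar names here are placeholders:)
>>>>>>>
>>>>>>>     SparkConf conf = new SparkConf()
>>>>>>>         .setJars(new String[] { "kafka-clients.jar", "my-encoders.jar" });
>>>>>>>     JavaSparkContext sc = new JavaSparkContext(conf);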
>>>>>>>
>>>>>>>
>>>>>>> Please help me resolve this.
>>>>>>>
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Murthy Chelankuri
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
