Thanks Prajwal.

I tried these options and they make no difference.

On Thu, Jul 7, 2016 at 12:20 PM Prajwal Tuladhar <p...@infynyxx.com> wrote:

> You can try playing with the experimental flags [1]
> `spark.executor.userClassPathFirst`
> and `spark.driver.userClassPathFirst`. But these can also potentially
> break other things (for example, dependencies that Spark itself requires
> could be overridden by the ones in your application jar), so you will need to verify.
>
> [1] https://spark.apache.org/docs/latest/configuration.html
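>
> For example (a minimal sketch; the jar path and class name below are
> placeholders), the flags could be passed at submit time like this:
>
>   spark-submit \
>     --conf spark.executor.userClassPathFirst=true \
>     --conf spark.driver.userClassPathFirst=true \
>     --jars /path/to/extra.jar \
>     --class com.example.MainJob \
>     main-job.jar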
>
> On Thu, Jul 7, 2016 at 4:05 PM, Chen Song <chen.song...@gmail.com> wrote:
>
>> Sorry to spam people who are not interested. I would greatly appreciate it
>> if anyone who is familiar with this could share some insights.
>>
>> On Wed, Jul 6, 2016 at 2:28 PM Chen Song <chen.song...@gmail.com> wrote:
>>
>>> Hi
>>>
>>> I ran into problems using class loaders in Spark. In my code (run within an
>>> executor), I explicitly load classes using the context class loader, as below.
>>>
>>> Thread.currentThread().getContextClassLoader()
>>>
>>> The jar containing the classes to be loaded is added via the --jars
>>> option in spark-shell/spark-submit.
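>>>
>>> Concretely, the executor-side code looks roughly like this (a simplified
>>> sketch; the class name com.example.Plugin and the RDD are placeholders):
>>>
>>>   rdd.mapPartitions { iter =>
>>>     // runs on the executors; the jar containing com.example.Plugin
>>>     // was passed via --jars
>>>     val loader = Thread.currentThread().getContextClassLoader()
>>>     val clazz = Class.forName("com.example.Plugin", true, loader)
>>>     val plugin = clazz.newInstance()
>>>     iter.map(record => s"$plugin $record")
>>>   }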
>>>
>>> I always get a ClassNotFoundException. However, it seems to work if I
>>> compile these classes into the main jar for the job (the jar containing the
>>> main job class).
>>>
>>> I know Spark implements its own class loaders in a particular way. Is
>>> there a way to work around this? In other words, what is the proper way to
>>> programmatically load classes in other jars added via --jars in Spark?
>>>
>>>
>
>
> --
> Cheers,
> Praj
>
