Re: spark classloader question

2016-07-07 Thread Chen Song
Thanks, Prajwal.

I tried these options and they make no difference.

On Thu, Jul 7, 2016 at 12:20 PM Prajwal Tuladhar  wrote:

> You can try to play with the experimental flags [1]
> `spark.executor.userClassPathFirst`
> and `spark.driver.userClassPathFirst`. But these can also potentially
> break other things (for example, dependencies that the Spark master requires
> at initialization being overridden by the Spark app, and so on), so you will need to verify.
>
> [1] https://spark.apache.org/docs/latest/configuration.html
>
> On Thu, Jul 7, 2016 at 4:05 PM, Chen Song  wrote:
>
>> Sorry to spam people who are not interested. I would greatly appreciate it
>> if anyone familiar with this could share some insights.
>>
>> On Wed, Jul 6, 2016 at 2:28 PM Chen Song  wrote:
>>
>>> Hi
>>>
>>> I ran into problems using the class loader in Spark. In my code (run within
>>> an executor), I explicitly load classes using the context class loader, as below.
>>>
>>> Thread.currentThread().getContextClassLoader()
>>>
>>> The jar containing the classes to be loaded is added via the --jars
>>> option in spark-shell/spark-submit.
>>>
>>> I always get a class-not-found exception. However, it seems to work if
>>> I compile these classes into the main jar for the job (the jar containing the
>>> main job class).
>>>
>>> I know Spark implements its own class loaders in a particular way. Is
>>> there a way to work around this? In other words, what is the proper way to
>>> programmatically load classes from other jars added via --jars in Spark?
>>>
>>>
>
>
> --
> --
> Cheers,
> Praj
>


Re: spark classloader question

2016-07-07 Thread Chen Song
Thanks, Marco.

The code snippet looks something like this:

ClassLoader cl = Thread.currentThread().getContextClassLoader();
String packagePath = "com.xxx.xxx";
final Enumeration<URL> resources = cl.getResources(packagePath);

The resources collection is always empty, indicating that no classes are found.
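
An aside not taken from the thread, based on the standard ClassLoader API:
getResources expects a slash-separated resource name, so a dotted package
string like "com.xxx.xxx" typically yields an empty enumeration regardless of
which loader is used. A minimal sketch of the slash-separated form, with the
package path left as a placeholder:

// imports: java.net.URL, java.util.Enumeration
ClassLoader cl = Thread.currentThread().getContextClassLoader();
String packagePath = "com/xxx/xxx"; // slash-separated resource name, not a dotted package
Enumeration<URL> resources = cl.getResources(packagePath); // declares IOException
while (resources.hasMoreElements()) {
    // one URL per jar or directory on the classpath that contains the package
    System.out.println(resources.nextElement());
}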

As I mentioned in my original email, it works when I include those classes
in the fat jar. Our use case is that each team will create its own jar
containing its own protobuf schema classes. I cannot realistically build a fat
jar that includes every class for every environment.

Chen


On Thu, Jul 7, 2016 at 12:18 PM Marco Mistroni  wrote:

> Hi Chen,
> please post:
> 1. the code snippet
> 2. the exception
>
> Is there any particular reason why you need to load classes from other jars
> programmatically?
>
> Have you tried building a fat jar with all the dependencies?
>
> hth
> marco
>
> On Thu, Jul 7, 2016 at 5:05 PM, Chen Song  wrote:
>
>> Sorry to spam people who are not interested. I would greatly appreciate it
>> if anyone familiar with this could share some insights.
>>
>> On Wed, Jul 6, 2016 at 2:28 PM Chen Song  wrote:
>>
>>> Hi
>>>
>>> I ran into problems using the class loader in Spark. In my code (run within
>>> an executor), I explicitly load classes using the context class loader, as below.
>>>
>>> Thread.currentThread().getContextClassLoader()
>>>
>>> The jar containing the classes to be loaded is added via the --jars
>>> option in spark-shell/spark-submit.
>>>
>>> I always get a class-not-found exception. However, it seems to work if
>>> I compile these classes into the main jar for the job (the jar containing the
>>> main job class).
>>>
>>> I know Spark implements its own class loaders in a particular way. Is
>>> there a way to work around this? In other words, what is the proper way to
>>> programmatically load classes from other jars added via --jars in Spark?
>>>
>>>
>


Re: spark classloader question

2016-07-07 Thread Prajwal Tuladhar
You can try to play with the experimental flags [1]
`spark.executor.userClassPathFirst`
and `spark.driver.userClassPathFirst`. But these can also potentially break
other things (for example, dependencies that the Spark master requires at
initialization being overridden by the Spark app, and so on), so you will need to verify.

[1] https://spark.apache.org/docs/latest/configuration.html
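
A hedged illustration of how these flags can be passed on the command line
(the jar and class names below are placeholders, not taken from this thread):

spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars team-schemas.jar \
  --class com.example.MainJob \
  main-job.jar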

On Thu, Jul 7, 2016 at 4:05 PM, Chen Song  wrote:

> Sorry to spam people who are not interested. I would greatly appreciate it
> if anyone familiar with this could share some insights.
>
> On Wed, Jul 6, 2016 at 2:28 PM Chen Song  wrote:
>
>> Hi
>>
>> I ran into problems using the class loader in Spark. In my code (run within
>> an executor), I explicitly load classes using the context class loader, as below.
>>
>> Thread.currentThread().getContextClassLoader()
>>
>> The jar containing the classes to be loaded is added via the --jars
>> option in spark-shell/spark-submit.
>>
>> I always get a class-not-found exception. However, it seems to work if
>> I compile these classes into the main jar for the job (the jar containing the
>> main job class).
>>
>> I know Spark implements its own class loaders in a particular way. Is
>> there a way to work around this? In other words, what is the proper way to
>> programmatically load classes from other jars added via --jars in Spark?
>>
>>


-- 
--
Cheers,
Praj


Re: spark classloader question

2016-07-07 Thread Marco Mistroni
Hi Chen,
please post:
1. the code snippet
2. the exception

Is there any particular reason why you need to load classes from other jars
programmatically?

Have you tried building a fat jar with all the dependencies?

hth
marco

On Thu, Jul 7, 2016 at 5:05 PM, Chen Song  wrote:

> Sorry to spam people who are not interested. I would greatly appreciate it
> if anyone familiar with this could share some insights.
>
> On Wed, Jul 6, 2016 at 2:28 PM Chen Song  wrote:
>
>> Hi
>>
>> I ran into problems using the class loader in Spark. In my code (run within
>> an executor), I explicitly load classes using the context class loader, as below.
>>
>> Thread.currentThread().getContextClassLoader()
>>
>> The jar containing the classes to be loaded is added via the --jars
>> option in spark-shell/spark-submit.
>>
>> I always get a class-not-found exception. However, it seems to work if
>> I compile these classes into the main jar for the job (the jar containing the
>> main job class).
>>
>> I know Spark implements its own class loaders in a particular way. Is
>> there a way to work around this? In other words, what is the proper way to
>> programmatically load classes from other jars added via --jars in Spark?
>>
>>


Re: spark classloader question

2016-07-07 Thread Chen Song
Sorry to spam people who are not interested. I would greatly appreciate it if
anyone familiar with this could share some insights.

On Wed, Jul 6, 2016 at 2:28 PM Chen Song  wrote:

> Hi
>
> I ran into problems using the class loader in Spark. In my code (run within
> an executor), I explicitly load classes using the context class loader, as below.
>
> Thread.currentThread().getContextClassLoader()
>
> The jar containing the classes to be loaded is added via the --jars option
> in spark-shell/spark-submit.
>
> I always get a class-not-found exception. However, it seems to work if I
> compile these classes into the main jar for the job (the jar containing the
> main job class).
>
> I know Spark implements its own class loaders in a particular way. Is
> there a way to work around this? In other words, what is the proper way to
> programmatically load classes from other jars added via --jars in Spark?
>
>
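
A minimal sketch of the pattern the thread describes, for readers who want to
reproduce it. The class name is a placeholder for a protobuf schema class
shipped via --jars, and `sc` is assumed to be an existing JavaSparkContext;
per the thread, this reportedly fails when the class lives only in a --jars
jar rather than in the application jar:

import java.util.Arrays;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

// "com.xxx.MyProtoMessage" is a placeholder class name from a jar passed via --jars.
JavaRDD<String> classNames = sc.parallelize(Arrays.asList("com.xxx.MyProtoMessage"));
JavaRDD<String> resolved = classNames.map(name -> {
    // On the executor, resolve the class through the task's context class loader.
    ClassLoader cl = Thread.currentThread().getContextClassLoader();
    Class<?> clazz = Class.forName(name, true, cl);
    return clazz.getName();
});
resolved.collect(); // throws if the class cannot be resolved on the executors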