Hi,

I'm trying to make Spark work in a multithreaded Java application.
What I'm trying to do is:

- Create a single SparkContext
- Create multiple SparkILoop and SparkIMain instances
- Inject the created SparkContext into each SparkIMain interpreter
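
Roughly, the setup looks like this (a minimal sketch, assuming Spark 1.x's
repl API; I create the SparkIMain directly here rather than going through
SparkILoop, and names like ReplSetup and makeInterpreter are just my own):

  import java.io.{PrintWriter, StringWriter}
  import scala.tools.nsc.Settings
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.repl.SparkIMain

  object ReplSetup {
    // a single SparkContext shared by the whole application
    val sc = new SparkContext(new SparkConf().setAppName("multi-repl"))

    // build a fresh interpreter and inject the shared context into it as `sc`
    def makeInterpreter(): SparkIMain = {
      val settings = new Settings()
      settings.usejavacp.value = true
      val intp = new SparkIMain(settings, new PrintWriter(new StringWriter()))
      intp.initializeSynchronous()
      intp.bind("sc", "org.apache.spark.SparkContext", sc)
      intp
    }
  }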

A thread is created for every user request; it takes a SparkILoop and
interprets some code, roughly as sketched below.
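
Continuing the sketch above (Server, handleRequest, and the round-robin pick
of an interpreter are my own illustration, not the real request handling):

  object Server {
    import ReplSetup.makeInterpreter

    // a small pool of interpreters; each user request picks one
    val interpreters = Seq(makeInterpreter(), makeInterpreter())

    // one thread per user request, interpreting code in the chosen interpreter
    def handleRequest(id: Int, code: String): Unit =
      new Thread(new Runnable {
        override def run(): Unit = {
          interpreters(id % interpreters.size).interpret(code)
        }
      }).start()
  }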

My problem is:
 - If a thread takes the first SparkILoop instance, everything works fine.
 - If a thread takes any other SparkILoop instance, Spark cannot find the
closures / case classes that I defined inside the interpreter.

I read some previous threads, and I think it's related to SparkEnv and the
ClosureCleaner. I tried SparkEnv.set(env) with the env I captured right after
the SparkContext was created, but I still get a ClassNotFoundException.
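
Concretely, the change I tried looks like this (again a sketch against the
Spark 1.x API, where SparkEnv.set is still public; ServerWithEnv is just my
name for the amended handler):

  object ServerWithEnv {
    import org.apache.spark.SparkEnv

    // captured on the main thread, right after the SparkContext was created
    val env = SparkEnv.get

    def handleRequest(id: Int, code: String): Unit =
      new Thread(new Runnable {
        override def run(): Unit = {
          SparkEnv.set(env) // propagate the driver's SparkEnv to this thread
          Server.interpreters(id % Server.interpreters.size).interpret(code)
        }
      }).start()
  }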

Can anyone give me some ideas?
Thanks.


Best,
moon
