Greetings,

The first comment on the issue says that the reason multiple contexts are not 
supported is:
"There are numerous assumptions in the code base that uses a shared cache or 
thread local variables or some global identifiers
which prevent us from using multiple SparkContext's."

Could this be worked around by creating those contexts in separate 
classloaders, each with its own copy of the Spark classes?
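To illustrate the idea (a minimal sketch, not using Spark itself): a child 
classloader created with a null parent defines its own copy of every class it 
loads, including that class's static state. Here `FakeGlobal` is a hypothetical 
stand-in for the shared caches and global identifiers mentioned above; it is 
compiled at runtime so the example is self-contained.

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;
import java.lang.reflect.Field;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class IsolationDemo {
    public static void main(String[] args) throws Exception {
        // Write a tiny class whose mutable static field stands in for the
        // global singletons that force SparkContext to be single-instance.
        Path dir = Files.createTempDirectory("iso");
        Path src = dir.resolve("FakeGlobal.java");
        Files.write(src,
            "public class FakeGlobal { public static int counter = 0; }"
                .getBytes());

        // Compile it into the same temp directory (requires a JDK).
        JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
        javac.run(null, null, null, src.toString());

        URL[] urls = { dir.toUri().toURL() };
        // parent = null: no delegation to the application loader, so each
        // URLClassLoader defines its own FakeGlobal class, and therefore
        // its own independent static state.
        try (URLClassLoader a = new URLClassLoader(urls, null);
             URLClassLoader b = new URLClassLoader(urls, null)) {
            Class<?> ca = a.loadClass("FakeGlobal");
            Class<?> cb = b.loadClass("FakeGlobal");

            Field fa = ca.getField("counter");
            fa.setInt(null, 41);                 // mutate loader A's copy
            Field fb = cb.getField("counter");

            System.out.println(ca != cb);        // true: distinct classes
            System.out.println(fa.getInt(null)); // 41
            System.out.println(fb.getInt(null)); // 0: B's copy untouched
        }
    }
}
```

By analogy, two SparkContexts created inside two such isolated loaders would 
each see their own copies of Spark's static caches and thread-locals, though 
whether Spark's native libraries and JVM-wide resources (ports, shutdown 
hooks) also tolerate this is a separate question.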

Thanks,
Anton
