Hi Anton,

That could solve some of the issues (I've played with that a little
bit), but there are still areas where it would be sub-optimal,
because Spark still uses system properties in some places, and those
are global to the JVM, not per-classloader.
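
To make that concrete, here's a tiny illustration (the property name
is made up):

    // System properties belong to the JVM, not to a classloader. A copy
    // of Spark loaded by classloader A and another loaded by classloader
    // B still share the same process-wide table, because java.lang.System
    // always comes from the bootstrap loader.
    object GlobalPropsDemo {
      def main(args: Array[String]): Unit = {
        System.setProperty("spark.some.setting", "context-a") // "context A" writes
        println(System.getProperty("spark.some.setting"))     // "context B" sees context-a
      }
    }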

(SparkSubmit is the biggest offender here, but if you're running
multiple contexts in the same VM you're probably not using
SparkSubmit anyway. The rest of the code is a lot better, but I
wouldn't count on it being 100% safe.)
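
If you want to experiment with the classloader idea anyway, the rough
shape would be something like the sketch below (untested;
IsolatedContext and sparkAssembly are made-up names, and it assumes
the jars you pass in bundle the Scala library, like the Spark assembly
jar does):

    import java.net.{URL, URLClassLoader}

    // Load a private copy of the Spark classes and build the context
    // reflectively, so no Spark class is shared with the host
    // application's classloader. This isolates classes, but NOT
    // system properties.
    object IsolatedContext {
      def newContext(sparkAssembly: Array[URL], appName: String): AnyRef = {
        // Null parent = fully isolated from the application classloader;
        // only the bootstrap classes (java.*) are shared.
        val loader = new URLClassLoader(sparkAssembly, null)
        val confCls = loader.loadClass("org.apache.spark.SparkConf")
        val conf = confCls.getConstructor().newInstance()
        confCls.getMethod("setAppName", classOf[String]).invoke(conf, appName)
        confCls.getMethod("setMaster", classOf[String]).invoke(conf, "local[*]")
        val ctxCls = loader.loadClass("org.apache.spark.SparkContext")
        ctxCls.getConstructor(confCls).newInstance(conf)
      }
    }

You'd then have to drive the resulting context reflectively as well,
since the SparkContext class in your own loader is a different class
from the one inside the isolated loader. And again, this isolates
classes, not system properties.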


On Wed, Dec 17, 2014 at 6:23 PM, Anton Brazhnyk
<anton.brazh...@genesys.com> wrote:
> Greetings,
>
> The first comment on the issue says that the reason multiple contexts
> are not supported is:
> “There are numerous assumptions in the code base that uses a shared cache or
> thread local variables or some global identifiers
> which prevent us from using multiple SparkContext's.”
>
> Could this be worked around by creating those contexts in separate
> classloaders, each with its own copy of the Spark classes?
>
> Thanks,
>
> Anton



-- 
Marcelo
