Yes, although once you have multiple ClassLoaders, you are operating
as if in multiple JVMs for most intents and purposes. I think the
request for this kind of functionality comes from use cases where
multiple ClassLoaders wouldn't work, such as wanting one app (in
one ClassLoader) to manage multiple contexts.
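
For what it's worth, below is a rough, untested sketch of what that kind of
isolation could look like. Everything in it is an assumption on my part: the
child-first loader, the placeholder jar path, and driving SparkConf /
SparkContext reflectively through each loader rather than referencing them
directly. It is not a supported pattern, just an illustration of the plumbing
involved:

import java.net.{URL, URLClassLoader}

// Sketch of a classloader that loads Spark classes from its own jars instead
// of delegating to the parent, so each "context" gets its own copy of Spark's
// static/global state. In practice you would probably have to child-first more
// packages than just org.apache.spark, which is part of why this gets hairy.
class IsolatedSparkLoader(jars: Array[URL], parent: ClassLoader)
    extends URLClassLoader(jars, parent) {

  override def loadClass(name: String, resolve: Boolean): Class[_] = {
    if (name.startsWith("org.apache.spark")) {
      // Load Spark classes child-first; delegate everything else.
      var c: Class[_] = findLoadedClass(name)
      if (c == null) c = findClass(name)
      if (resolve) resolveClass(c)
      c
    } else {
      super.loadClass(name, resolve)
    }
  }
}

object MultiContextSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder path to a Spark assembly jar; adjust for your installation.
    val sparkJars = Array(new URL("file:///opt/spark/lib/spark-assembly.jar"))

    // Two loaders -> two independent copies of the Spark classes.
    val loaderA = new IsolatedSparkLoader(sparkJars, getClass.getClassLoader)
    val loaderB = new IsolatedSparkLoader(sparkJars, getClass.getClassLoader)

    // Anything that touches SparkContext has to be created reflectively
    // through its own loader, otherwise the two copies of the classes mix.
    def newContext(loader: ClassLoader, appName: String): AnyRef = {
      val confCls = loader.loadClass("org.apache.spark.SparkConf")
      val conf = confCls.getConstructor().newInstance().asInstanceOf[AnyRef]
      confCls.getMethod("setMaster", classOf[String]).invoke(conf, "local[2]")
      confCls.getMethod("setAppName", classOf[String]).invoke(conf, appName)
      val scCls = loader.loadClass("org.apache.spark.SparkContext")
      scCls.getConstructor(confCls).newInstance(conf).asInstanceOf[AnyRef]
    }

    val scA = newContext(loaderA, "app-a")
    val scB = newContext(loaderB, "app-b")

    // ... use each context only via classes from its own loader ...

    scA.getClass.getMethod("stop").invoke(scA)
    scB.getClass.getMethod("stop").invoke(scB)
  }
}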

On Thu, Dec 18, 2014 at 2:23 AM, Anton Brazhnyk
<anton.brazh...@genesys.com> wrote:
> Greetings,
>
> The first comment on the issue says that the reason for not supporting
> multiple contexts is:
> “There are numerous assumptions in the code base that uses a shared cache or
> thread local variables or some global identifiers
> which prevent us from using multiple SparkContext's.”
>
> Could it be worked around by creating those contexts in several classloaders,
> each with its own copy of the Spark classes?
>
> Thanks,
>
> Anton

