[
https://issues.apache.org/jira/browse/SPARK-754?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14212113#comment-14212113
]
Andrew Ash commented on SPARK-754:
----------------------------------
This is actually currently unsupported; a ticket to make it possible is
being tracked at SPARK-2243.
I'm going to close this ticket as a duplicate of that one, but please let me
know if you feel there are subtleties here that keep these from being
duplicates.
Thanks for the bug report Erik!
Andrew
> Multiple Spark Contexts active in a single JVM
> ----------------------------------------------
>
> Key: SPARK-754
> URL: https://issues.apache.org/jira/browse/SPARK-754
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 0.7.0
> Reporter: Erik James Freed
> Priority: Critical
>
> This may be no more than creating a unit test to ensure it can be done, but it
> is not clear that one can instantiate multiple SparkContexts within a single
> JVM and use them concurrently (one thread in a context at a time).
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)