As far as I can tell, the Mesos backend would still not work correctly
with multiple SparkContexts.

However, if you are just after Spark query concurrency, Spark 0.8 seems
to support concurrent (reentrant) requests to the same session
(SparkContext). One should also be able to use the FAIR scheduler in
this case, it seems (at least that's the mode I request). So I just
semaphore the same context while keeping the pool of SparkContexts at 1.
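The pattern above can be sketched roughly as follows. This is a minimal illustration, not code from the project: the class and method names are hypothetical, a type parameter stands in for the SparkContext, and a `java.util.concurrent.Semaphore` plays the role of the gate around the single shared context.

```java
import java.util.concurrent.Semaphore;
import java.util.function.Function;

// Hypothetical sketch: all callers share one context instance (a "pool" of
// size 1); a semaphore bounds how many may use it concurrently.
public class SingleContextPool<C> {
    private final C context;           // the single shared context (e.g. a SparkContext)
    private final Semaphore permits;

    public SingleContextPool(C context, int maxConcurrent) {
        this.context = context;
        this.permits = new Semaphore(maxConcurrent, true); // fair ordering
    }

    // Borrow the shared context for one unit of work; blocks when the
    // concurrency limit is reached, and always releases the permit.
    public <T> T withContext(Function<C, T> work) {
        permits.acquireUninterruptibly();
        try {
            return work.apply(context);
        } finally {
            permits.release();
        }
    }
}
```

With the context itself reentrant, the semaphore mainly caps how many concurrent queries hit the one session at a time; the fair-scheduling side (in Spark 0.8 this would be the `spark.scheduler.mode` property, if I recall correctly) is configured on the SparkContext separately.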

See the doc folder for details.



On Wed, Nov 20, 2013 at 3:26 AM, Mingyu Kim <[email protected]> wrote:

> Hi all,
>
> I’ve been searching to find out the current status of the multiple
> SparkContext support in one JVM. I found
> https://groups.google.com/forum/#!topic/spark-developers/GLx8yunSj0A and
> https://groups.google.com/forum/#!topic/spark-users/cOYP96I668I.
> According to the threads, I should be able to create multiple SparkContexts
> by setting the port to “0” every time. However, Matei mentioned that
> SparkEnv should be part of TaskContext rather than being thread local. Does
> this cause any problem with running multiple SparkContexts in one JVM right
> now or is this just a clean-up that needs to be done eventually? I’m
> wondering if Spark supports multiple SparkContexts as is right now or if
> there is anything I should be careful about when creating multiple
> SparkContexts.
>
> Thanks!
>
> Mingyu
>
