multiple SparkContexts in one JVM. Thanks!
Mingyu
From: Dmitriy Lyubimov
Reply-To: "user@spark.incubator.apache.org"
Date: Wednesday, November 20, 2013 at 1:42 PM
To: "user@spark.incubator.apache.org"
Subject: Re: Multiple SparkContexts in one JVM
Oh. I suppose if you
>> Mingyu Kim wrote:
>>
>>> Hi all,
>>>
>>> I’ve been searching to find out the current status of the multiple
>>> SparkContext support in one JVM. I found
>>> https://groups.google.com/forum/#!topic/spark-developers/GLx8yunSj0A and
>>> https://groups.google.com/forum/#!topic/spark-users/cOYP96I668I.
>>> According to the threads, I should be able to create multiple
>>> SparkContexts by setting the port to “0” every time. However, Matei
>>> mentioned that SparkEnv should be part of TaskContext rather than being
>>> thread local. Does this cause any problem with running multiple
>>> SparkContexts in one JVM right now or is this just a clean-up