Re: Multiple SparkContexts in one JVM

2013-11-20 Thread Dmitriy Lyubimov
…multiple SparkContexts in one JVM. Thanks! Mingyu From: Dmitriy Lyubimov Reply-To: "user@spark.incubator.apache.org" Date: Wednesday, November 20, 2013 at 1:42 PM To: "user@spark.incubator.apache.org" …

Re: Multiple SparkContexts in one JVM

2013-11-20 Thread Mingyu Kim
…multiple SparkContexts in one JVM. Thanks! Mingyu From: Dmitriy Lyubimov Reply-To: "user@spark.incubator.apache.org" Date: Wednesday, November 20, 2013 at 1:42 PM To: "user@spark.incubator.apache.org" Subject: Re: Multiple SparkContexts in one JVM Oh. I suppose if you …

Re: Multiple SparkContexts in one JVM

2013-11-20 Thread Dmitriy Lyubimov
…Mingyu Kim wrote: Hi all, I’ve been searching to find out the current status of the multiple SparkContext support in one JVM. I found https://groups.google.com/forum/#!topic/spark-developers/GLx8yunSj0A and …

Re: Multiple SparkContexts in one JVM

2013-11-20 Thread Dmitriy Lyubimov
…multiple SparkContext support in one JVM. I found https://groups.google.com/forum/#!topic/spark-developers/GLx8yunSj0A and https://groups.google.com/forum/#!topic/spark-users/cOYP96I668I. According to the threads, I should be able to create multiple SparkContexts …

Re: Multiple SparkContexts in one JVM

2013-11-20 Thread Matt Cheah
…should be able to create multiple SparkContexts by setting the port to “0” every time. However, Matei mentioned that SparkEnv should be part of TaskContext rather than being thread-local. Does this cause any problem with running multiple SparkContexts in one JVM right now, or is this just a clean-up …
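The workaround quoted above, setting the port to “0” so each driver binds an ephemeral port, would correspond to a configuration sketch like the following. `spark.driver.port` is the Spark property controlling the port the driver listens on; reading `0` as "let the OS pick any free port" is the behavior the thread relies on, and this fragment is illustrative only, not a confirmed fix for the SparkEnv thread-local issue Matei raised:

```
# Sketch of a Spark properties fragment (assumption: spark.driver.port
# honors 0 as "bind an ephemeral port" in this Spark version).
# With a fixed port, a second SparkContext in the same JVM would fail to
# bind; with 0, each context gets its own OS-assigned port.
spark.driver.port   0
```

Note that even with distinct ports, the thread's open question remains whether the shared thread-local SparkEnv makes two concurrent contexts in one JVM unsafe.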