Hi all,

I've been searching to find out the current status of the multiple
SparkContext support in one JVM. I found
https://groups.google.com/forum/#!topic/spark-developers/GLx8yunSj0A and
https://groups.google.com/forum/#!topic/spark-users/cOYP96I668I. According
to the threads, I should be able to create multiple SparkContexts by setting
the port to "0" every time. However, Matei mentioned that SparkEnv should be
part of TaskContext rather than being thread local. Does this cause any
problem with running multiple SparkContexts in one JVM right now, or is this
just a clean-up that needs to be done eventually? I'm wondering if Spark
supports multiple SparkContexts as is right now, or if there is anything I
should be careful about when creating multiple SparkContexts.
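For concreteness, here is the kind of setup I have in mind -- a minimal sketch,
assuming the port-related properties (e.g. spark.ui.port) accept "0" to mean
"pick any free ephemeral port"; the exact property names are my assumption from
the threads above, not something I have verified:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: two SparkContexts in the same JVM, each asking for
// ephemeral ports so they do not collide on fixed port numbers.
val conf1 = new SparkConf()
  .setMaster("local")
  .setAppName("context-one")
  .set("spark.ui.port", "0") // "0" assumed to mean "any free port"

val sc1 = new SparkContext(conf1)

val conf2 = new SparkConf()
  .setMaster("local")
  .setAppName("context-two")
  .set("spark.ui.port", "0")

// This second construction is the part I am unsure about -- it may still
// conflict with JVM-global state (e.g. the thread-local SparkEnv) even
// if the ports themselves are free.
val sc2 = new SparkContext(conf2)
```

Is something along these lines expected to work, or does the SparkEnv issue
make it unsafe regardless of the port settings?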

Thanks!

Mingyu

