Hi Forum,

Is it possible to run multiple SparkContexts concurrently in Spark 1.3.0
without stopping one of them first?
I have been trying this out and am getting the error below.

Caused by: org.apache.spark.SparkException: Only one SparkContext may be
running in this JVM (see SPARK-2243). To ignore this error, set
spark.driver.allowMultipleContexts = true. The currently running
SparkContext was created at:

According to this, it is not possible to create a second SparkContext unless
we set the option spark.driver.allowMultipleContexts = true.

So is there a way to create multiple concurrently running SparkContexts in
the same JVM, or should we launch separate driver processes in different
JVMs to achieve the same thing?

Also, please let me know where the option
'spark.driver.allowMultipleContexts' should be set. I have set it in
SPARK_MASTER_OPTS in spark-env.sh, but with no luck.
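
For reference, here is roughly what I am trying (a minimal sketch in Scala,
assuming Spark 1.3.0 running with a local master; the app names and the
MultiContextTest object are just placeholders):

  import org.apache.spark.{SparkConf, SparkContext}

  object MultiContextTest {
    def main(args: Array[String]): Unit = {
      val conf1 = new SparkConf()
        .setAppName("first-context")   // placeholder name
        .setMaster("local[*]")
      val sc1 = new SparkContext(conf1)

      // Without the flag below, this second constructor call throws:
      // "Only one SparkContext may be running in this JVM (see SPARK-2243)"
      val conf2 = new SparkConf()
        .setAppName("second-context")  // placeholder name
        .setMaster("local[*]")
        .set("spark.driver.allowMultipleContexts", "true")
      val sc2 = new SparkContext(conf2)

      sc2.stop()
      sc1.stop()
    }
  }

My understanding is that this property is checked on the driver side when
the SparkContext is constructed, which would explain why setting it in
SPARK_MASTER_OPTS (a setting for the standalone master daemon) has no
effect, but please correct me if that is wrong.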


