Hi

I want to create multiple SparkContexts in my application.
I have read many articles suggesting that "usage of multiple contexts is
discouraged, since SPARK-2243 is still not resolved."
I want to know: does Spark 1.5.0 support creating multiple contexts
without error?
And if it is supported, do we need to set the
"spark.driver.allowMultipleContexts" configuration parameter?
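
For reference, a minimal Scala sketch of what I am attempting, assuming the
1.5.x API (the app name and master here are just placeholders). My
understanding is that without the flag, constructing a second active
SparkContext throws a SparkException, and the flag only downgrades that
check to a warning — it does not fix SPARK-2243:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Set the flag on the driver's configuration before creating any context.
val conf = new SparkConf()
  .setAppName("multi-context-test")   // placeholder app name
  .setMaster("local[*]")              // placeholder master
  .set("spark.driver.allowMultipleContexts", "true")

val sc1 = new SparkContext(conf)
// Without allowMultipleContexts=true this second construction would fail
// with "Only one SparkContext may be running in this JVM".
val sc2 = new SparkContext(conf)
```

(This requires a Spark 1.5.x installation to actually run; I am only
showing the configuration I am asking about.)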

Regards
Prateek



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/is-Multiple-Spark-Contexts-is-supported-in-spark-1-5-0-tp25568.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
