OK.

I think it is a somewhat unusual usage pattern, but it should work.

As I said before, if you want those Spark applications to share cluster
resources, proper configuration is needed on the Spark side.
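For example, a rough sketch (the jar path, class name, queue name, and
resource numbers below are placeholders, not from your setup): each child
application launched through SparkLauncher can be capped with its own conf,
e.g. spark.executor.instances, executor memory/cores, and spark.yarn.queue,
so the total stays within the cluster:

  import org.apache.spark.launcher.SparkLauncher

  // Cap each child application so several of them can share the cluster.
  // All values here are placeholders.
  val child = new SparkLauncher()
    .setAppResource("/path/to/child-app.jar")
    .setMainClass("com.example.ChildJob")
    .setMaster("yarn")
    .setConf("spark.executor.instances", "4")
    .setConf(SparkLauncher.EXECUTOR_MEMORY, "2g")
    .setConf(SparkLauncher.EXECUTOR_CORES, "2")
    .setConf("spark.yarn.queue", "default")

Alternatively, dynamic allocation (spark.dynamicAllocation.enabled, together
with the external shuffle service on YARN) lets the applications grow and
shrink their executor counts instead of using fixed caps.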

If you submit the main driver and all the other Spark applications in client
mode under YARN, you should make sure the node running the drivers has enough
resources to run all of them, since in client mode each application's driver
runs on the submitting node.
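For example (numbers made up for illustration): if the main driver uses 4g
and four child applications are launched in client mode with
spark.driver.memory=2g each, that node needs roughly 12g just for the driver
JVMs, before any other overhead.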

I am not sure if you can use `SparkLauncher` to submit them in different
modes, e.g., the main driver in client mode and the others in cluster mode.
It is worth trying.
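If you do try it, a minimal sketch might look like the following (the jar
path, class name, and memory setting are placeholders); setDeployMode
controls where each child driver runs, and startApplication returns a handle
you can poll:

  import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

  // Launched from inside the main driver (itself running in client mode).
  // Each child application is submitted in cluster mode, so its driver runs
  // in a YARN container instead of on this node.
  val handle: SparkAppHandle = new SparkLauncher()
    .setAppResource("/path/to/child-app.jar")
    .setMainClass("com.example.ChildJob")
    .setMaster("yarn")
    .setDeployMode("cluster")
    .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
    .startApplication()

  // Wait for the child application to reach a terminal state.
  while (!handle.getState.isFinal) {
    Thread.sleep(1000)
  }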


-----
Liang-Chi Hsieh | @viirya 
Spark Technology Center 
http://www.spark.tc/ 
