Hi, I have a Spark driver program with a loop that iterates around 2000 times, and on each iteration it submits a job to YARN. Since the loop runs the jobs serially, I want to introduce parallelism. If I create 2000 tasks/runnables/callables in my Spark driver program, will they execute in parallel on the YARN cluster? Please guide me; it would be great if you could share example code for running multiple threads in the driver program. I am new to Spark. Thanks in advance.
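For reference, the general pattern is to submit jobs from a thread pool in the driver: the SparkContext is thread-safe for submitting jobs, so actions triggered from different threads become concurrent Spark jobs (with spark.scheduler.mode=FAIR they share cluster resources instead of queueing FIFO). Below is a minimal sketch of that threading pattern using Python's standard library; `run_job` is a hypothetical placeholder for whatever Spark action each loop iteration triggers (e.g. a collect() or a write), and the worker count is an assumption you would tune rather than using 2000 threads at once.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_job(i):
    # Hypothetical placeholder for the per-iteration Spark action.
    # In a real driver this would trigger a job, for example:
    #   return sc.parallelize(data[i]).map(transform).collect()
    return i * i  # stand-in result so the sketch is self-contained

def run_all(num_jobs=2000, max_workers=16):
    # Submitting actions from multiple driver threads lets Spark's
    # scheduler run the resulting jobs concurrently on the cluster.
    # A bounded pool (not 2000 threads) keeps the driver responsive.
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(run_job, i): i for i in range(num_jobs)}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()  # re-raises job errors
    return results

if __name__ == "__main__":
    out = run_all(num_jobs=10, max_workers=4)
    print(len(out))
```

The same shape works in Scala/Java with an ExecutorService or Futures; the key point is that each thread calls an action on the shared SparkContext, and the pool size bounds how many jobs are in flight concurrently.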
-- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Example-code-to-spawn-multiple-threads-in-driver-program-tp24290.html