Hi All,
When running concurrent Spark jobs on YARN (Spark 1.5.2) that share a
single SparkContext, the jobs take more time to complete than when they
run with separate SparkContexts.
The jobs are submitted from different threads.
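
Roughly, the submission pattern looks like the minimal sketch below (this is
just an illustration, not the actual Spark Job Server code; the class name
and the workload are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object SharedContextTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("shared-context-test")
    val sc = new SparkContext(conf)

    // Three jobs submitted concurrently from separate threads,
    // all sharing the same SparkContext
    val threads = (1 to 3).map { i =>
      new Thread(new Runnable {
        override def run(): Unit = {
          // placeholder workload: each job counts a transformed range
          val result = sc.parallelize(1 to 1000000, 100).map(_ * 2).count()
          println(s"Job $i finished, count = $result")
        }
      })
    }

    threads.foreach(_.start())
    threads.foreach(_.join())

    sc.stop()
  }
}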
Test Cases:
A. 3 Spark jobs submitted serially
B. 3 Spark jobs submitted concurrently, each with a different SparkContext
C. 3 Spark jobs submitted concurrently with the same SparkContext
D. 3 Spark jobs submitted concurrently with the same SparkContext and
triple the resources.
A and B take equal time, but C and D take 2-3 times longer than A, which
shows that concurrency does not improve with a shared SparkContext. [Spark
Job Server]
Thanks,
Prabhu Joseph