Hello,
I tested the Spark Fair Scheduler and found that it did not behave the way
I expected.
According to the Spark docs, the Fair Scheduler assigns tasks in a
round-robin fashion:
https://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application
In my understanding, if there are 2 jobs, the Fair Scheduler should assign
tasks from each job alternately.
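To make my expectation concrete, here is a toy Python sketch (not Spark
code, just an illustration of what I mean by "alternately"): with 2 jobs
of 5 tasks each and one core, I expect a strictly interleaved order.

```python
# Toy model only -- NOT Spark internals. Illustrates what a strict
# round-robin assignment of tasks from 2 jobs would look like.
def round_robin(jobs):
    """Interleave tasks, taking one task from each job in turn."""
    order = []
    queues = [list(tasks) for tasks in jobs]
    while any(queues):
        for q in queues:
            if q:
                order.append(q.pop(0))
    return order

job1 = [f"job1-task{i}" for i in range(5)]
job2 = [f"job2-task{i}" for i in range(5)]
# Strictly alternating: job1-task0, job2-task0, job1-task1, job2-task1, ...
print(round_robin([job1, job2]))
```

What I actually observe differs from this strict interleaving.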
But in my environment, one job sometimes obtains resources for several
tasks in a row, and sometimes it does not.
Why is it not exactly round-robin?
I am using Spark 2.3.2, and my test submits 2 concurrent jobs, each of
which has 5 tasks. The execution mode is local with 1 core.
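For reference, my setup uses settings along these lines (a sketch, not my
exact configuration file):

```
spark.master         local[1]
spark.scheduler.mode FAIR
```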
--
Regards,
Yuta
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org