Hi Anton,
Spark Pools and Spark's Fair Scheduler schedule tasks across jobs within a
single Spark application. Each Spark job has multiple stages, and each stage
has multiple tasks.
This is different from the YARN Fair Scheduler, which schedules the
applications submitted to the YARN cluster.
Hi everyone,
Spark supports in-application job concurrency by using pools and
Spark's Fair Scheduler (which is different from YARN's Fair Scheduler).
link:
https://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application
Is this feature supported when YARN is used as the cluster manager?