dongjoon-hyun commented on code in PR #45982:
URL: https://github.com/apache/spark/pull/45982#discussion_r1560434354
##########
docs/job-scheduling.md:
##########
@@ -92,6 +96,8 @@ In standalone mode, simply start your workers with `spark.shuffle.service.enabled`
In YARN mode, follow the
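(For context, the hunk above concerns the external shuffle service needed for dynamic allocation. A minimal sketch of the relevant settings — `spark.dynamicAllocation.enabled` and `spark.shuffle.service.enabled` are real Spark configuration keys; the values shown are illustrative, not taken from this PR:)

```properties
# spark-defaults.conf — illustrative dynamic-allocation setup
spark.dynamicAllocation.enabled    true
# The external shuffle service lets executors be removed without
# losing the shuffle files they wrote.
spark.shuffle.service.enabled      true
```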
beliefer commented on code in PR #45982:
URL: https://github.com/apache/spark/pull/45982#discussion_r1560363139
##########
docs/job-scheduling.md:
##########
@@ -53,7 +53,11 @@ Resource allocation can be configured as follows, based on the cluster type:
on the cluster
beliefer commented on PR #45982:
URL: https://github.com/apache/spark/pull/45982#issuecomment-2048826987
> Do we support scheduling jobs across applications? It's odd to me.
This section is about scheduling across applications.
The `Scheduling Within an Application` section is related
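(For readers following this thread: the `Scheduling Within an Application` section of the same doc configures fair scheduling between jobs via a pool file. A minimal sketch — the XML schema matches Spark's documented `fairscheduler.xml` format, but the pool name and values here are hypothetical:)

```xml
<?xml version="1.0"?>
<!-- fairscheduler.xml, referenced via spark.scheduler.allocation.file -->
<allocations>
  <!-- "production" is an illustrative pool name -->
  <pool name="production">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>
    <minShare>3</minShare>
  </pool>
</allocations>
```

A job is then submitted to a pool from application code with `sc.setLocalProperty("spark.scheduler.pool", "production")`.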
dongjoon-hyun commented on code in PR #45982:
URL: https://github.com/apache/spark/pull/45982#discussion_r1559997742
##########
docs/job-scheduling.md:
##########
@@ -53,7 +53,11 @@ Resource allocation can be configured as follows, based on the cluster type:
on the cluster
yaooqinn commented on PR #45982:
URL: https://github.com/apache/spark/pull/45982#issuecomment-2047422813
Nit: use K8s instead of K8S; the former is the official abbreviation.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
yaooqinn commented on PR #45982:
URL: https://github.com/apache/spark/pull/45982#issuecomment-2047412733
I didn't even notice this page or section. The dropdown from the navigation bar is enough for me.
Do we support scheduling jobs across applications? It's odd to me.