tgravescs commented on pull request #29332:
URL: https://github.com/apache/spark/pull/29332#issuecomment-670010625
Oh right, if you don't specify the executor cores in standalone mode, each executor gets all of the worker's cores by default. Which honestly causes lots of issues - I filed a JIRA about this for other reasons. Right now it's left up to the user to properly configure their cluster and job.
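For anyone following along, a minimal sketch of pinning the core counts yourself on standalone (the master URL and the numbers are just placeholders for whatever fits your workers):

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: cap each executor at 4 cores on a standalone cluster instead of
// letting it grab every core on the worker (the default when spark.executor.cores is unset).
val spark = SparkSession.builder()
  .master("spark://master-host:7077")   // hypothetical standalone master URL
  .appName("resource-config-example")
  .config("spark.executor.cores", "4")  // cores per executor
  .config("spark.cores.max", "16")      // total cores this app may claim across the cluster
  .getOrCreate()
```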
There are other things in Spark 3.0 that expect cores to be the limiting resource. Dynamic allocation is a big one, which I know you don't care about, but I'm pretty sure other places throughout the code assume it as well.
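To illustrate what I mean by cores being the limiting resource for dynamic allocation: the executor target is sized from task slots derived from cores, roughly like the sketch below (the names are mine, simplified from what ExecutorAllocationManager actually does, not the real code):

```scala
// Simplified sketch: dynamic allocation derives its executor target purely from cores.
def maxExecutorsNeeded(pendingAndRunningTasks: Int, executorCores: Int, taskCpus: Int): Int = {
  val taskSlotsPerExecutor = executorCores / taskCpus   // cores alone decide slots here
  math.ceil(pendingAndRunningTasks.toDouble / taskSlotsPerExecutor).toInt
}

// e.g. 100 tasks, 4-core executors, 1 cpu per task => asks for 25 executors,
// regardless of any other resource (GPUs etc.) that might actually be the bottleneck.
```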
We should remove the dynamic allocation check here so that the warning gets printed for standalone as well:
https://github.com/apache/spark/blob/branch-3.0/core/src/main/scala/org/apache/spark/SparkContext.scala#L2836
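Roughly the change I have in mind, as a sketch only (the actual condition in SparkContext.scala is shaped differently; the names below are placeholders):

```scala
// Hypothetical before/after of the guard around the "wasted resources" warning.
def warnOnWastedResources(
    dynamicAllocationEnabled: Boolean,
    taskLimit: Int,
    slotsPerExecutor: Int): Unit = {
  // Before: only warn when dynamic allocation is on, so standalone users never see it.
  //   if (dynamicAllocationEnabled && taskLimit < slotsPerExecutor) ...
  // After: warn whenever the resource configs leave slots unusable, standalone included.
  if (taskLimit < slotsPerExecutor) {
    println(s"WARN: resource configuration limits runnable tasks per executor to " +
      s"$taskLimit of $slotsPerExecutor slots; some resources will be wasted")
  }
}
```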
If you want to go through and make sure everything is updated to schedule based on resources, I'm fine with that, but I'm pretty sure it needs to be more than this.