Github user tgravescs commented on the issue:
https://github.com/apache/spark/pull/19194
I agree that it's going to be a toss-up from the user's perspective as
to which setting gets applied. But if the TaskSetManager is using the
active job's properties, and those are what it bases the number of tasks
to run on, then we should match those same properties in the allocation
manager so that we request the right number of executors for what the
task set manager is actually trying to run.
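To make the concern concrete, here is a minimal sketch (function and parameter names are hypothetical, not Spark's actual internals) of why the allocation manager must size its executor request from the same per-job properties the TaskSetManager uses:

```python
import math

def target_executors(num_tasks, executor_cores, task_cpus):
    """Hypothetical sketch: how many executors are needed so that
    num_tasks can run concurrently, given cores per executor and
    CPUs required per task."""
    tasks_per_executor = executor_cores // task_cpus
    return math.ceil(num_tasks / tasks_per_executor)

# If the TaskSetManager honors a per-job override (say task_cpus = 2
# instead of the cluster default of 1) but the allocation manager
# computes with the default, it will request half the executors the
# task set actually needs:
with_default  = target_executors(100, executor_cores=4, task_cpus=1)  # 25
with_override = target_executors(100, executor_cores=4, task_cpus=2)  # 50
```

The mismatch between 25 and 50 executors here is the disagreement the comment is pointing at: both components have to read the same (job-level) configuration or the cluster is under- or over-provisioned.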