Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3393#issuecomment-64141499
Yes - let's just fix the bug with this patch and punt on improving this
until later.
The bug with this patch is that if an offer comes in for a node that is
already in `slavesWithExecutors`, the offer goes unused but is never
declined. One simple way to fix this is to keep track of the nodes that were
assigned an offer in that round, then decline any offers that were not
included in that set. This assumes that we don't get multiple offers for a
given host, but I'm pretty sure the old code assumed that as well.
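A minimal sketch of that idea, using simplified stand-in types (`Offer`, a hypothetical `SimplifiedScheduler`, and `launch`/`decline` callbacks) rather than the real Mesos API: record which hosts were assigned an offer during the round, then decline every offer whose host is not in that set.

```scala
import scala.collection.mutable

// Simplified stand-in for a Mesos resource offer (not the real Mesos type).
case class Offer(id: String, hostname: String, cpus: Double)

class SimplifiedScheduler {
  // Hosts that already have an executor running (analogous to slavesWithExecutors).
  val slavesWithExecutors = mutable.Set.empty[String]

  def resourceOffers(
      offers: Seq[Offer],
      launch: Offer => Unit,
      decline: Offer => Unit): Unit = {
    // Hosts that were assigned an offer during this round.
    val assignedThisRound = mutable.Set.empty[String]

    for (offer <- offers) {
      val usable = !slavesWithExecutors.contains(offer.hostname) && offer.cpus >= 1.0
      if (usable) {
        slavesWithExecutors += offer.hostname
        assignedThisRound += offer.hostname
        launch(offer)
      }
    }

    // Decline every offer that was not assigned this round, including offers
    // for hosts that already have executors -- the case the patch misses.
    for (offer <- offers if !assignedThisRound.contains(offer.hostname)) {
      decline(offer)
    }
  }
}
```

Note that keying the set by hostname only works under the one-offer-per-host assumption mentioned above; if a host could appear in several offers in the same round, the set would need to track offer IDs instead.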