Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/5385#issuecomment-113338014
Well, there are at least two other important considerations:
- We should use the batch queue instead of the task queue
- We should never kill executors with receivers
The first is important because the existing heuristic watches the pending
task queue, which rarely backs up in streaming workloads, so dynamic
allocation currently does essentially nothing for most of them. The second
is crucial because killing an executor that hosts a receiver would break
data ingestion, and we don't want dynamic allocation to disrupt the basic
function of the application.
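
For illustration only (not something this PR implements), a streaming-aware
policy along these lines could key off the batch delays reported through a
`StreamingListener` and exclude receiver executors from any kill decision.
This is a hedged sketch: `StreamingAllocationSketch`, `receiverExecutorIds`,
and `idleExecutorIds` are hypothetical names; only the `StreamingListener`
callback and `requestExecutors`/`killExecutors` on `SparkContext` are real
Spark APIs, and the latter two only take effect on cluster managers that
support dynamic allocation.

```scala
import org.apache.spark.SparkContext
import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

// Hedged sketch: the class name and the two call-by-name parameters are
// hypothetical placeholders for however placement/idleness is tracked.
class StreamingAllocationSketch(
    sc: SparkContext,
    batchDurationMs: Long,
    receiverExecutorIds: => Set[String],   // executors hosting receivers (assumed tracked elsewhere)
    idleExecutorIds: => Seq[String])       // executors considered safe to release
  extends StreamingListener {

  override def onBatchCompleted(batch: StreamingListenerBatchCompleted): Unit = {
    val info = batch.batchInfo
    val schedulingDelay = info.schedulingDelay.getOrElse(0L)
    val processingDelay = info.processingDelay.getOrElse(0L)

    if (schedulingDelay + processingDelay > batchDurationMs) {
      // Batches are queueing up: scale up based on the batch queue, not the task queue.
      sc.requestExecutors(1)
    } else if (schedulingDelay == 0L && processingDelay < batchDurationMs / 2) {
      // Headroom to scale down, but never kill an executor that hosts a receiver.
      idleExecutorIds.filterNot(receiverExecutorIds).headOption.foreach { id =>
        sc.killExecutors(Seq(id))
      }
    }
  }
}
```

Such a listener would be registered with
`ssc.addStreamingListener(new StreamingAllocationSketch(...))`; the point is
only to show the two considerations above expressed as code, not to propose
this exact policy.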