GitHub user sharkdtu opened a pull request:
https://github.com/apache/spark/pull/20078
[SPARK-22900] [Spark-Streaming] Remove unnecessary restriction on streaming
dynamic allocation
## What changes were proposed in this pull request?
When I set the conf `spark.streaming.dynamicAllocation.enabled=true`, the
conf `num-executors` cannot be set. As a result, the application starts with
the default of 2 executors, and all receivers run on those 2 executors, so
there may be no spare CPU cores left for processing tasks and the job gets
stuck indefinitely.
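
To illustrate the problem, here is a minimal, hypothetical receiver-based job
(not taken from the patch): with only the default 2 executors (one core each),
the two socket receivers below occupy both cores permanently, leaving nothing
for batch processing.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical example only: two receivers on two single-core executors
// leave no cores free for the actual batch tasks, so batches never complete.
object StuckStreamingJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("StuckStreamingJob")
      .set("spark.streaming.dynamicAllocation.enabled", "true")

    val ssc = new StreamingContext(conf, Seconds(10))

    // Two receiver input streams -> two long-running receiver tasks.
    val lines = ssc.socketTextStream("host1", 9999)
      .union(ssc.socketTextStream("host2", 9999))

    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```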
In my opinion, we should remove this unnecessary restriction on streaming
dynamic allocation and allow `num-executors` to be set together with
`spark.streaming.dynamicAllocation.enabled=true`. When the application
starts, each receiver can then run on its own executor; see the sketch below.
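
As a rough sketch of the configuration this change is meant to permit (the
application name and executor counts here are illustrative assumptions, not
part of the patch), the initial executor count can be sized for the receivers
while streaming dynamic allocation still scales the job afterwards:

```scala
import org.apache.spark.SparkConf

// Hypothetical configuration: request enough initial executors for the
// receivers plus processing, with streaming dynamic allocation enabled.
val conf = new SparkConf()
  .setAppName("ReceiverFriendlyStreamingJob")
  // equivalent to --num-executors 4 when submitting on YARN
  .set("spark.executor.instances", "4")
  .set("spark.streaming.dynamicAllocation.enabled", "true")
  .set("spark.streaming.dynamicAllocation.minExecutors", "2")
  .set("spark.streaming.dynamicAllocation.maxExecutors", "10")
```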
## How was this patch tested?
Manual test.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/sharkdtu/spark master
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/20078.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #20078
----
commit 6a7d07b7f135ed8ad079a1918fe3484757960df0
Author: sharkdtu <sharkdtu@...>
Date: 2017-12-25T13:13:16Z
remove unnecessary restrict for streaming dynamic allocation
----
---