GitHub user karth295 opened a pull request:
https://github.com/apache/spark/pull/19183
[SPARK-21960][Streaming] Spark Streaming Dynamic Allocation should respect spark.executor.instances
## What changes were proposed in this pull request?
Removes the check that `spark.executor.instances` is set to 0 when using
Streaming dynamic allocation (DRA).
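For context, a submission like the following would previously have been rejected and, with this change, is accepted (the configuration keys come from the PR title and description; the specific values and application jar name are illustrative only):

```shell
# Illustrative spark-submit invocation. Before this change, setting
# spark.executor.instances to a nonzero value while Streaming DRA was
# enabled failed validation; after the change, both may be set together.
spark-submit \
  --conf spark.executor.instances=10 \
  --conf spark.streaming.dynamicAllocation.enabled=true \
  --conf spark.streaming.dynamicAllocation.minExecutors=5 \
  --conf spark.streaming.dynamicAllocation.maxExecutors=20 \
  my-streaming-app.jar
```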
## How was this patch tested?
Manual tests
My only concern with this PR is that `spark.executor.instances` (or the
actual initial number of executors the cluster manager gives Spark) can fall
outside the range `spark.streaming.dynamicAllocation.minExecutors` to
`spark.streaming.dynamicAllocation.maxExecutors`. I don't see a good way around
that, because this code only runs after the SparkContext has been created.
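The concern above can be made concrete with a hypothetical configuration (values are illustrative, not from the PR):

```shell
# Illustrative mismatch: the cluster manager starts 50 executors, which is
# above the Streaming DRA maximum of 20. Since the DRA bounds are only
# enforced by later scale-up/scale-down decisions (the check runs after the
# SparkContext exists), the app briefly runs outside [min, max] until the
# allocator adjusts it back into range.
spark-submit \
  --conf spark.executor.instances=50 \
  --conf spark.streaming.dynamicAllocation.enabled=true \
  --conf spark.streaming.dynamicAllocation.minExecutors=5 \
  --conf spark.streaming.dynamicAllocation.maxExecutors=20 \
  my-streaming-app.jar
```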
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/karth295/spark master
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/19183.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #19183
----
commit 4c9769e232a5d028f48235260bde682a1d3b059a
Author: Karthik Palaniappan <[email protected]>
Date: 2017-09-08T22:10:13Z
[SPARK-21960][Streaming] Spark Streaming Dynamic Allocation should respect spark.executor.instances
----
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]