GitHub user liu-zhaokun opened a pull request:
https://github.com/apache/spark/pull/19919
[SPARK-22727] spark.executor.instances's default value should be 2
https://issues.apache.org/jira/browse/SPARK-22727
## What changes were proposed in this pull request?
When I run an application on YARN without setting spark.executor.instances,
I expect its default value to be 2, as running-on-yarn.md documents. However,
the driver logs "spark.executor.instances less than
spark.dynamicAllocation.minExecutors is invalid, ignoring its setting, please
update your configs.", which tells me the effective default of this
configuration is not 2. So I think we should fix it.
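As a rough sketch of what I believe is happening (the object and method names
below are illustrative, not the actual Spark internals): if the configuration
is read as an optional value that falls back to 0 rather than the documented 2,
an unset spark.executor.instances compares as 0 against
spark.dynamicAllocation.minExecutors and triggers exactly this warning.

```scala
import org.apache.spark.SparkConf

// Illustrative sketch only; not the actual Spark source.
object InitialExecutorsSketch {
  def initialExecutors(conf: SparkConf): Int = {
    val minExecutors = conf.getInt("spark.dynamicAllocation.minExecutors", 0)
    // Assumption: an unset spark.executor.instances behaves like 0 here,
    // not like the default of 2 documented in running-on-yarn.md.
    val instances =
      conf.getOption("spark.executor.instances").map(_.toInt).getOrElse(0)
    if (instances < minExecutors) {
      // Mirrors the driver warning quoted above.
      println("spark.executor.instances less than " +
        "spark.dynamicAllocation.minExecutors is invalid, ignoring its " +
        "setting, please update your configs.")
    }
    math.max(instances, minExecutors)
  }
}
```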
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/liu-zhaokun/spark master1207
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/19919.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #19919
commit 9c3cd44c0cf0501023a8adab262a14f257acf012
Author: liuzhaokun
Date: 2017-12-07T09:10:29Z
[SPARK-22727] spark.executor.instances's default value should be 2