On Thu, Apr 16, 2015 at 07:47:51PM +0100, Sean Owen wrote:
IIRC that was fixed already in 1.3
https://github.com/apache/spark/commit/b2047b55c5fc85de6b63276d8ab9610d2496e08b
From that commit:
+  private val minNumExecutors =
+    conf.getInt("spark.dynamicAllocation.minExecutors", 0)
...
+    if (maxNumExecutors == 0) {
+      throw new SparkException("spark.dynamicAllocation.maxExecutors cannot be 0!")
+    }
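For anyone who wants to see the effect of that validation outside of a running cluster, here is a minimal standalone sketch of the check the commit adds. This is not the actual Spark source: the config is modeled as a plain Map instead of SparkConf, SparkException is stubbed locally, and the min <= max check is included as an illustration of the related validation in ExecutorAllocationManager.

```scala
// Standalone sketch of the dynamic-allocation validation (assumptions:
// Map stands in for SparkConf, local SparkException stub).
class SparkException(message: String) extends Exception(message)

def validateDynamicAllocation(conf: Map[String, String]): Unit = {
  // Defaults mirror the commit: min defaults to 0, max to Int.MaxValue.
  val minNumExecutors =
    conf.getOrElse("spark.dynamicAllocation.minExecutors", "0").toInt
  val maxNumExecutors =
    conf.getOrElse("spark.dynamicAllocation.maxExecutors", Int.MaxValue.toString).toInt

  // The check added in b2047b55: a max of 0 would make the pool useless.
  if (maxNumExecutors == 0) {
    throw new SparkException("spark.dynamicAllocation.maxExecutors cannot be 0!")
  }
  // Illustrative companion check: min must not exceed max.
  if (minNumExecutors > maxNumExecutors) {
    throw new SparkException(
      s"spark.dynamicAllocation.minExecutors ($minNumExecutors) must " +
      s"be less than or equal to spark.dynamicAllocation.maxExecutors ($maxNumExecutors)!")
  }
}
```

So with the fix, min = 0 is a perfectly valid default; only max = 0 is rejected.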