[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-14 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/19919 Can one of the admins verify this patch?

[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-11 Thread srowen
Github user srowen commented on the issue: https://github.com/apache/spark/pull/19919 If the default is correctly described as 2, then I think there is nothing to do here and this should be closed. This change will cause other problems, at least.

[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-08 Thread vanzin
Github user vanzin commented on the issue: https://github.com/apache/spark/pull/19919 Also `YarnSparkHadoopUtil.DEFAULT_NUMBER_EXECUTORS`, which actually seems to be used.
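For context, here is a minimal sketch of the fallback role such a constant plays, written against the public `SparkConf` API rather than the actual YARN code path; the object name, method name, and local constant below are illustrative, and only `SparkConf`, `getInt`, and the `spark.executor.instances` key come from Spark itself:

```scala
import org.apache.spark.SparkConf

object ExecutorFallbackSketch {
  // Stand-in for YarnSparkHadoopUtil.DEFAULT_NUMBER_EXECUTORS: a hard-coded
  // fallback applied only when the user never set the key at all.
  val DefaultNumberExecutors = 2

  // getInt returns the configured value if present, otherwise the fallback.
  def initialTargetExecutors(conf: SparkConf): Int =
    conf.getInt("spark.executor.instances", DefaultNumberExecutors)

  def main(args: Array[String]): Unit = {
    val unset = new SparkConf(loadDefaults = false)
    val set   = new SparkConf(loadDefaults = false).set("spark.executor.instances", "10")
    println(initialTargetExecutors(unset)) // 2  -- fallback constant used
    println(initialTargetExecutors(set))   // 10 -- explicit setting wins
  }
}
```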

[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-08 Thread vanzin
Github user vanzin commented on the issue: https://github.com/apache/spark/pull/19919 `ApplicationMasterArguments.DEFAULT_NUMBER_EXECUTORS`.

[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-07 Thread liu-zhaokun
Github user liu-zhaokun commented on the issue: https://github.com/apache/spark/pull/19919 @srowen If the default is not correct, I can fix it in this PR while I'm at it.

[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-07 Thread srowen
Github user srowen commented on the issue: https://github.com/apache/spark/pull/19919 Hm, I thought the default would appear in `.../config/package.scala`, but I don't see it. I'm not actually sure it defaults to 2 anywhere (?). (@vanzin does this ring a bell?) I don't think these
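For reference, a sketch of what an entry in `config/package.scala` could look like with and without a built-in default, assuming Spark's internal `ConfigBuilder` DSL; `ConfigBuilder` is `private[spark]`, so this only compiles inside the Spark source tree, and the declarations below are an illustration rather than a quote of the real file:

```scala
import org.apache.spark.internal.config.ConfigBuilder

object ExecutorInstancesEntrySketch {
  // An optional entry carries no default of its own, which would explain why
  // no "2" shows up in config/package.scala for spark.executor.instances.
  val EXECUTOR_INSTANCES = ConfigBuilder("spark.executor.instances")
    .intConf
    .createOptional

  // A hypothetical entry with a baked-in default would look like this instead
  // (the key name here is made up purely for illustration).
  val EXECUTOR_INSTANCES_WITH_DEFAULT = ConfigBuilder("spark.hypothetical.executor.instances")
    .intConf
    .createWithDefault(2)
}
```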

[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-07 Thread liu-zhaokun
Github user liu-zhaokun commented on the issue: https://github.com/apache/spark/pull/19919 @srowen Where is the default value "2" of spark.executor.instances used?

[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-07 Thread srowen
Github user srowen commented on the issue: https://github.com/apache/spark/pull/19919 No, this change breaks the existing logic. These instances do not actually control the default number of instances, but instead are used to detect whether the value was set at all. With this change,
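To make the distinction concrete, here is a small sketch against the public `SparkConf` API: "was the value set at all" and "what number do we fall back to" are separate questions, and the concern above is that this change collapses the first one. The object name, helper names, and the `Fallback` constant are illustrative, not Spark's actual code:

```scala
import org.apache.spark.SparkConf

object SetVersusDefaultSketch {
  val Key      = "spark.executor.instances"
  val Fallback = 2 // stand-in for the DEFAULT_NUMBER_EXECUTORS constants

  // Question 1: did the user explicitly request a static executor count?
  def explicitlySet(conf: SparkConf): Boolean = conf.contains(Key)

  // Question 2: how many executors should we start with if nothing was set?
  def initialExecutors(conf: SparkConf): Int = conf.getInt(Key, Fallback)

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(loadDefaults = false)
    // Unset: the fallback applies, yet we can still tell nobody asked for it.
    assert(!explicitlySet(conf) && initialExecutors(conf) == 2)
    // Per the comment above, the constants are tied to this kind of set-vs-unset
    // detection, so changing them changes behaviour, not just the documented default.
  }
}
```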

[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

2017-12-07 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/19919 Can one of the admins verify this patch?