Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-62332453
I'm going to close this issue as wontfix.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/1465
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-56902717
If this is being used for testing, I don't see a compelling reason to add a
config over using the constructor.
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-55338661
@kbzod Why do we need a separate config for the local case? I think the
correct solution is to use the same config, but set a different default value
for local mode.
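andrewor14's suggestion above (one shared setting whose default depends on the
deploy mode) could be sketched like this; the property name
`spark.task.maxFailures` and the default values shown are assumptions for
illustration, not the PR's code:

```scala
// Sketch: a single maxFailures setting with a mode-dependent default.
// Local mode fails fast by default; cluster modes allow retries.
def defaultMaxFailures(master: String): Int =
  if (master.startsWith("local")) 1 else 4

def maxFailures(conf: Map[String, String], master: String): Int =
  conf.get("spark.task.maxFailures").map(_.toInt)
    .getOrElse(defaultMaxFailures(master))
```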
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-55338673
add to whitelist
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-54694585
Can one of the admins verify this patch?
Github user kbzod commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-53260871
@JoshRosen You are right, the `local[N, maxFailures]` mechanism works
already, but the filer of
[SPARK-2083](https://issues.apache.org/jira/browse/SPARK-2083) stated that
Github user roji commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-51718343
+1
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-51719921
I think there's already a mechanism to set this by using `local[N,
maxFailures]` to create your SparkContext:
```scala
// Regular expression for local[N,
```
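The `local[N, maxFailures]` master string is matched against a regular
expression along these lines (a sketch of the parsing idea, not the exact
Spark source):

```scala
// Sketch: parse a master URL of the form "local[N, maxFailures]", where
// N is the worker thread count and maxFailures the task retry limit.
val LocalNFailuresRegex = """local\[([0-9]+)\s*,\s*([0-9]+)\]""".r

def parseLocalMaster(master: String): Option[(Int, Int)] =
  master match {
    case LocalNFailuresRegex(threads, maxFailures) =>
      Some((threads.toInt, maxFailures.toInt))
    case _ => None
  }
```

With a scheme like this, a master string such as `"local[4, 3]"` would yield
4 worker threads and allow up to 3 task failures.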
Github user kbzod commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15245692
--- Diff: docs/configuration.md ---
@@ -599,6 +599,15 @@ Apart from these, the following properties are also
available, and may be useful
Github user kbzod commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15245710
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1477,7 +1478,8 @@ object SparkContext extends Logging {
def localCpuCount
Github user kbzod commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15245702
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1463,12 +1463,13 @@ object SparkContext extends Logging {
// Regular
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15148111
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1463,12 +1463,13 @@ object SparkContext extends Logging {
// Regular
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15148112
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1477,7 +1478,8 @@ object SparkContext extends Logging {
def localCpuCount
Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15148114
--- Diff: docs/configuration.md ---
@@ -599,6 +599,15 @@ Apart from these, the following properties are also
available, and may be useful
GitHub user kbzod opened a pull request:
https://github.com/apache/spark/pull/1465
SPARK-2083 Add support for spark.local.maxFailures configuration property
The logic in `SparkContext` for creating a new task scheduler now looks for
a spark.local.maxFailures property to specify the
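The lookup described above might look roughly like this; only the property
name `spark.local.maxFailures` comes from the PR title, while the regex and
the fallback default of 1 are assumptions for illustration:

```scala
// Sketch: when building a local task scheduler, read the proposed
// spark.local.maxFailures property (default of 1 is an assumption).
val LocalNRegex = """local\[([0-9]+)\]""".r

def localSchedulerParams(
    master: String,
    conf: Map[String, String]): Option[(Int, Int)] =
  master match {
    case LocalNRegex(threads) =>
      val maxFailures = conf.getOrElse("spark.local.maxFailures", "1").toInt
      Some((threads.toInt, maxFailures))
    case _ => None
  }
```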
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-49335741
Can one of the admins verify this patch?