Repository: spark
Updated Branches:
  refs/heads/master c9d612f82 -> 3df2d9314


[MINOR][DOC] Document local[*,F] master modes

## What changes were proposed in this pull request?

core/src/main/scala/org/apache/spark/SparkContext.scala supports the 
LOCAL_N_FAILURES_REGEX master mode, but it was never documented, so document it.

## How was this patch tested?

By using the GitHub Markdown preview feature.

Author: Maurus Cuelenaere <[email protected]>

Closes #16562 from mcuelenaere/patch-1.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/3df2d931
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/3df2d931
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/3df2d931

Branch: refs/heads/master
Commit: 3df2d93146a4609c1c4a25b635a898fe5c3be9b6
Parents: c9d612f
Author: Maurus Cuelenaere <[email protected]>
Authored: Sun Jan 15 11:14:50 2017 +0000
Committer: Sean Owen <[email protected]>
Committed: Sun Jan 15 11:14:50 2017 +0000

----------------------------------------------------------------------
 docs/submitting-applications.md | 2 ++
 1 file changed, 2 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/3df2d931/docs/submitting-applications.md
----------------------------------------------------------------------
diff --git a/docs/submitting-applications.md b/docs/submitting-applications.md
index b738194..b8b4cc3 100644
--- a/docs/submitting-applications.md
+++ b/docs/submitting-applications.md
@@ -137,7 +137,9 @@ The master URL passed to Spark can be in one of the following formats:
 <tr><th>Master URL</th><th>Meaning</th></tr>
 <tr><td> <code>local</code> </td><td> Run Spark locally with one worker thread (i.e. no parallelism at all). </td></tr>
 <tr><td> <code>local[K]</code> </td><td> Run Spark locally with K worker threads (ideally, set this to the number of cores on your machine). </td></tr>
+<tr><td> <code>local[K,F]</code> </td><td> Run Spark locally with K worker threads and F maxFailures (see <a href="configuration.html#scheduling">spark.task.maxFailures</a> for an explanation of this variable) </td></tr>
 <tr><td> <code>local[*]</code> </td><td> Run Spark locally with as many worker threads as logical cores on your machine.</td></tr>
+<tr><td> <code>local[*,F]</code> </td><td> Run Spark locally with as many worker threads as logical cores on your machine and F maxFailures.</td></tr>
 <tr><td> <code>spark://HOST:PORT</code> </td><td> Connect to the given <a href="spark-standalone.html">Spark standalone
         cluster</a> master. The port must be whichever one your master is configured to use, which is 7077 by default.
 </td></tr>
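As a hedged illustration of how a `local[K,F]` / `local[*,F]` master string decomposes into a thread count and a failure limit, here is a standalone sketch. The regex is modeled on the shape of SparkContext's LOCAL_N_FAILURES_REGEX, but the object and method names are illustrative, not Spark's actual implementation:

```scala
// Illustrative sketch (not Spark's code): parse "local[K,F]" / "local[*,F]"
// master strings into (threads, maxFailures). The pattern mirrors the shape
// of SparkContext's LOCAL_N_FAILURES_REGEX.
object LocalMasterParser {
  private val LocalNFailures = """local\[(\*|[0-9]+)\s*,\s*([0-9]+)\]""".r

  // Returns Some((threads, maxFailures)); "*" means one thread per logical core.
  def parse(master: String): Option[(Int, Int)] = master match {
    case LocalNFailures(threads, failures) =>
      val k =
        if (threads == "*") Runtime.getRuntime.availableProcessors()
        else threads.toInt
      Some((k, failures.toInt))
    case _ => None  // plain "local", "local[K]", etc. are handled elsewhere
  }
}
```

For example, `LocalMasterParser.parse("local[4,2]")` yields `Some((4, 2))`: four worker threads, and each task may fail up to 2 times before the job is aborted.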

