Repository: spark
Updated Branches:
  refs/heads/branch-1.2 64e0cbc73 -> 58b3aa692


Add a note on context termination for the history server on YARN

The history server on YARN only shows completed jobs. This adds a note about
the need to stop the Spark context explicitly at the end of a Spark job,
which is a best practice anyway.
Related to SPARK-2972 and SPARK-3458

Author: moussa taifi <[email protected]>

Closes #4721 from moutai/add-history-server-note-for-closing-the-spark-context and squashes the following commits:

9f5b6c3 [moussa taifi] Fix upper case typo for YARN
3ad3db4 [moussa taifi] Add context termination for History server on Yarn

(cherry picked from commit c871e2dae0182e914135560d14304242e1f97f7e)
Signed-off-by: Andrew Or <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/58b3aa69
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/58b3aa69
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/58b3aa69

Branch: refs/heads/branch-1.2
Commit: 58b3aa692b653c29c1f2ae5a9d50938e2e980cf8
Parents: 64e0cbc
Author: moussa taifi <[email protected]>
Authored: Thu Feb 26 14:19:43 2015 -0800
Committer: Andrew Or <[email protected]>
Committed: Thu Feb 26 14:20:43 2015 -0800

----------------------------------------------------------------------
 docs/monitoring.md | 2 ++
 1 file changed, 2 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/58b3aa69/docs/monitoring.md
----------------------------------------------------------------------
diff --git a/docs/monitoring.md b/docs/monitoring.md
index f32cdef..a4880ad 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -149,6 +149,8 @@ follows:
 Note that in all of these UIs, the tables are sortable by clicking their headers,
 making it easy to identify slow tasks, data skew, etc.
 
+Note that the history server only displays completed Spark jobs. One way to signal the completion of a Spark job is to stop the Spark context explicitly (`sc.stop()`); in Python, you can use the `with SparkContext() as sc:` construct to handle the Spark context setup and teardown automatically while still recording the job history in the UI.
+
 # Metrics
 
 Spark has a configurable metrics system based on the 
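
As an illustration of the shutdown patterns the added note describes, here is a
minimal PySpark sketch (not part of the commit) showing both the explicit
sc.stop() call and the Python "with" form; the application name and the toy
job are illustrative assumptions.

    from pyspark import SparkConf, SparkContext

    # Pattern 1: stop the context explicitly so the application is marked as
    # completed and shows up in the history server.
    conf = SparkConf().setAppName("history-server-demo")  # illustrative name
    sc = SparkContext(conf=conf)
    try:
        print(sc.parallelize(range(100)).sum())
    finally:
        sc.stop()  # signals job completion to the history server

    # Pattern 2 (SPARK-3458): the context manager stops the context
    # automatically when the block exits, even if the job raises an error.
    with SparkContext(conf=SparkConf().setAppName("history-server-demo")) as sc:
        print(sc.parallelize(range(100)).sum())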

