Repository: spark
Updated Branches:
  refs/heads/branch-1.1 36f3c499f -> 2785210fa
Add a note for context termination for History server on Yarn

The history server on YARN only shows completed jobs. This adds a note
concerning the needed explicit context termination at the end of a Spark job,
which is a best practice anyway.

Related to SPARK-2972 and SPARK-3458

Author: moussa taifi <[email protected]>

Closes #4721 from moutai/add-history-server-note-for-closing-the-spark-context and squashes the following commits:

9f5b6c3 [moussa taifi] Fix upper case typo for YARN
3ad3db4 [moussa taifi] Add context termination for History server on Yarn

(cherry picked from commit c871e2dae0182e914135560d14304242e1f97f7e)
Signed-off-by: Andrew Or <[email protected]>

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/2785210f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/2785210f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/2785210f

Branch: refs/heads/branch-1.1
Commit: 2785210fa8f47cfea07598119671d3c006741f39
Parents: 36f3c49
Author: moussa taifi <[email protected]>
Authored: Thu Feb 26 14:19:43 2015 -0800
Committer: Andrew Or <[email protected]>
Committed: Thu Feb 26 14:20:51 2015 -0800

----------------------------------------------------------------------
 docs/monitoring.md | 2 ++
 1 file changed, 2 insertions(+)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/spark/blob/2785210f/docs/monitoring.md
----------------------------------------------------------------------
diff --git a/docs/monitoring.md b/docs/monitoring.md
index d07ec4a..c8bdc07 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -142,6 +142,8 @@ follows:
 Note that in all of these UIs, the tables are sortable by clicking their headers,
 making it easy to identify slow tasks, data skew, etc.
 
+Note that the history server only displays completed Spark jobs. One way to signal the completion of a Spark job is to stop the Spark Context explicitly (`sc.stop()`), or, in Python, to use the `with SparkContext() as sc:` construct to handle Spark Context setup and teardown, and still show the job history on the UI.
+
 # Metrics
 
 Spark has a configurable metrics system based on the

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
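The reasoning behind the added note can be sketched in plain Python, with no Spark installation required. `DemoContext` below is a hypothetical stand-in for `SparkContext` (not a real Spark class), included only to illustrate why both the `with` form and an explicit `stop()` in a `finally` block guarantee teardown runs even if the job body raises, which is what lets the history server mark the run as completed:

```python
class DemoContext:
    """Hypothetical stand-in for SparkContext; models only the stop/teardown behavior."""

    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.stop()       # teardown runs even if the body raised
        return False      # do not swallow exceptions

# The `with` form the note recommends for Python code:
with DemoContext() as ctx:
    pass                  # job body goes here

# The explicit form (`sc.stop()`); try/finally makes it exception-safe:
ctx2 = DemoContext()
try:
    pass                  # job body goes here
finally:
    ctx2.stop()

assert ctx.stopped and ctx2.stopped
```

In real PySpark code the same shape applies: construct the context, do the work, and ensure `sc.stop()` is reached on every exit path; a bare `sc.stop()` as the last statement is not reached if an earlier stage throws.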
