Repository: spark
Updated Branches:
  refs/heads/branch-1.0 9fbb22c20 -> 71ad53f81


fix broken link in python docs

Author: Andy Konwinski <[email protected]>

Closes #650 from andyk/python-docs-link-fix and squashes the following commits:

a1f9d51 [Andy Konwinski] fix broken in link in python docs
(cherry picked from commit c05d11bb307eaba40c5669da2d374c28debaa55a)

Signed-off-by: Patrick Wendell <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/71ad53f8
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/71ad53f8
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/71ad53f8

Branch: refs/heads/branch-1.0
Commit: 71ad53f8176fd9411edf04188a3fe0264777a781
Parents: 9fbb22c
Author: Andy Konwinski <[email protected]>
Authored: Sat May 10 12:46:51 2014 -0700
Committer: Patrick Wendell <[email protected]>
Committed: Sat May 10 12:47:18 2014 -0700

----------------------------------------------------------------------
 docs/python-programming-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/71ad53f8/docs/python-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/python-programming-guide.md b/docs/python-programming-guide.md
index 6813963..39fb5f0 100644
--- a/docs/python-programming-guide.md
+++ b/docs/python-programming-guide.md
@@ -45,7 +45,7 @@ errors = logData.filter(is_error)
 
 PySpark will automatically ship these functions to executors, along with any objects that they reference.
 Instances of classes will be serialized and shipped to executors by PySpark, but classes themselves cannot be automatically distributed to executors.
-The [Standalone Use](#standalone-use) section describes how to ship code dependencies to executors.
+The [Standalone Use](#standalone-programs) section describes how to ship code dependencies to executors.
 
 In addition, PySpark fully supports interactive use---simply run `./bin/pyspark` to launch an interactive shell.
 
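For reference, the context lines in the hunk above describe PySpark shipping user functions to executors. A minimal sketch of that pattern, assuming an RDD named `logData` already exists (as in the guide's surrounding `errors = logData.filter(is_error)` example):

```python
# A plain Python function: when passed to an RDD transformation, PySpark
# pickles it (along with any objects it references) and ships it to executors.
def is_error(line):
    return "ERROR" in line

# Hypothetical usage inside a PySpark session, where `logData` is an RDD
# of log lines (e.g. logData = sc.textFile("logs.txt")):
# errors = logData.filter(is_error)
```

Class instances referenced by such a function are serialized the same way, but the class definitions themselves must be shipped separately, which is what the corrected link target covers.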
