Repository: spark
Updated Branches:
  refs/heads/master 02ec058ef -> 8cd6eea62
add Sphinx as a dependency of building docs

Author: Davies Liu <dav...@databricks.com>

Closes #3388 from davies/doc_readme and squashes the following commits:

daa1482 [Davies Liu] add Sphinx dependency

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/8cd6eea6
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/8cd6eea6
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/8cd6eea6

Branch: refs/heads/master
Commit: 8cd6eea6298fc8e811dece38c2875e94ff863948
Parents: 02ec058
Author: Davies Liu <dav...@databricks.com>
Authored: Thu Nov 20 19:12:45 2014 -0800
Committer: Patrick Wendell <pwend...@gmail.com>
Committed: Thu Nov 20 19:13:16 2014 -0800

----------------------------------------------------------------------
 docs/README.md | 7 ++++++-
 1 file changed, 6 insertions(+), 1 deletion(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/spark/blob/8cd6eea6/docs/README.md
----------------------------------------------------------------------
diff --git a/docs/README.md b/docs/README.md
index d2d58e4..1194840 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -43,7 +43,7 @@ You can modify the default Jekyll build as follows:
 ## Pygments
 
 We also use pygments (http://pygments.org) for syntax highlighting in documentation markdown pages,
-so you will also need to install that (it requires Python) by running `sudo easy_install Pygments`.
+so you will also need to install that (it requires Python) by running `sudo pip install Pygments`.
 
 To mark a block of code in your markdown to be syntax highlighted by jekyll during the compile
 phase, use the following sytax:
@@ -53,6 +53,11 @@ phase, use the following sytax:
 // supported languages too.
 {% endhighlight %}
 
+## Sphinx
+
+We use Sphinx to generate Python API docs, so you will need to install it by running
+`sudo pip install sphinx`.
+
 ## API Docs (Scaladoc and Sphinx)
 
 You can build just the Spark scaladoc by running `sbt/sbt doc` from the SPARK_PROJECT_ROOT directory.
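The patched README has contributors install Pygments via pip because Jekyll shells out to it for each `{% highlight %}` block. A minimal sketch like the following can serve as a sanity check that the install works; the Scala snippet, lexer, and formatter choices here are illustrative and not part of the Spark docs build itself.

```python
# Sanity-check a Pygments install by rendering a small Scala snippet to HTML,
# roughly what Jekyll's {% highlight scala %} tag does during the docs build.
from pygments import highlight
from pygments.lexers import ScalaLexer
from pygments.formatters import HtmlFormatter

code = 'val rdd = sc.parallelize(1 to 10)'  # illustrative snippet
html = highlight(code, ScalaLexer(), HtmlFormatter())

# The formatter wraps tokenized output in a <div class="highlight"> block.
print(html)
```

If this prints an HTML fragment, Pygments is installed and importable by the Jekyll build.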