Repository: spark
Updated Branches:
  refs/heads/branch-1.0 9e90c464d -> 170b09d94


Some cleanup in build/docs

(a) Deleted an outdated line from the docs.
(b) Removed a workaround that is no longer necessary given the Mesos version bump.

Author: Patrick Wendell <pwend...@gmail.com>

Closes #382 from pwendell/maven-clean and squashes the following commits:

f0447fa [Patrick Wendell] Minor doc clean-up
(cherry picked from commit 98225a6effd077a1b97c7e485d45ffd89b2c5b7f)

Signed-off-by: Patrick Wendell <pwend...@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/170b09d9
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/170b09d9
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/170b09d9

Branch: refs/heads/branch-1.0
Commit: 170b09d943018cd62586e8f851c5d9dce665c767
Parents: 9e90c46
Author: Patrick Wendell <pwend...@gmail.com>
Authored: Fri Apr 11 10:45:27 2014 -0700
Committer: Patrick Wendell <pwend...@gmail.com>
Committed: Fri Apr 11 10:45:39 2014 -0700

----------------------------------------------------------------------
 docs/index.md | 2 --
 1 file changed, 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/170b09d9/docs/index.md
----------------------------------------------------------------------
diff --git a/docs/index.md b/docs/index.md
index 7a13fa9..89ec5b0 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -67,8 +67,6 @@ In addition, if you wish to run Spark on [YARN](running-on-yarn.html), set
 
 Note that on Windows, you need to set the environment variables on separate lines, e.g., `set SPARK_HADOOP_VERSION=1.2.1`.
 
-For this version of Spark (0.8.1) Hadoop 2.2.x (or newer) users will have to build Spark and publish it locally. See [Launching Spark on YARN](running-on-yarn.html). This is needed because Hadoop 2.2 has non backwards compatible API changes.
-
 # Where to Go from Here
 
 **Programming guides:**

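For context, the line kept in the diff above concerns Spark's SPARK_HADOOP_VERSION build variable. A minimal sketch of the difference it describes, assuming the sbt build of that era (SPARK_YARN here is the companion flag from the same docs page):

    # Unix shells: variables can prefix the build command on one line
    SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly

    # Windows cmd.exe: each variable needs its own `set` statement,
    # after which the build is invoked as usual
    set SPARK_HADOOP_VERSION=2.2.0
    set SPARK_YARN=true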