Repository: spark
Updated Branches:
  refs/heads/master d6e4c5917 -> 001acc446


[SPARK-4177][Doc]update build doc since JDBC/CLI support hive 13 now

Author: wangfei <[email protected]>

Closes #3042 from scwf/patch-9 and squashes the following commits:

3784ed1 [wangfei] remove 'TODO'
1891553 [wangfei] update build doc since JDBC/CLI support hive 13


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/001acc44
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/001acc44
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/001acc44

Branch: refs/heads/master
Commit: 001acc446345ccb1e494af9ff1d16dd65db8034e
Parents: d6e4c59
Author: wangfei <[email protected]>
Authored: Sun Nov 2 22:02:05 2014 -0800
Committer: Patrick Wendell <[email protected]>
Committed: Sun Nov 2 22:02:05 2014 -0800

----------------------------------------------------------------------
 docs/building-spark.md | 17 +++++++----------
 1 file changed, 7 insertions(+), 10 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/001acc44/docs/building-spark.md
----------------------------------------------------------------------
diff --git a/docs/building-spark.md b/docs/building-spark.md
index 4cc0b1f..238ddae 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -99,14 +99,11 @@ mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
 mvn -Pyarn-alpha -Phadoop-2.3 -Dhadoop.version=2.3.0 -Dyarn.version=0.23.7 -DskipTests clean package
 {% endhighlight %}
 
-<!--- TODO: Update this when Hive 0.13 JDBC is added -->
-
 # Building With Hive and JDBC Support
 To enable Hive integration for Spark SQL along with its JDBC server and CLI,
 add the `-Phive` profile to your existing build options. By default Spark
 will build with Hive 0.13.1 bindings. You can also build for Hive 0.12.0 using
-the `-Phive-0.12.0` profile. NOTE: currently the JDBC server is only
-supported for Hive 0.12.0.
+the `-Phive-0.12.0` profile.
 {% highlight bash %}
 # Apache Hadoop 2.4.X with Hive 13 support
 mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean package
@@ -121,8 +118,8 @@ Tests are run by default via the [ScalaTest Maven plugin](http://www.scalatest.o
 
 Some of the tests require Spark to be packaged first, so always run `mvn package` with `-DskipTests` the first time.  The following is an example of a correct (build, test) sequence:
 
-    mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive -Phive-0.12.0 clean package
-    mvn -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 test
+    mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive clean package
+    mvn -Pyarn -Phadoop-2.3 -Phive test
 
 The ScalaTest plugin also supports running only a specific test suite as follows:
 
@@ -185,16 +182,16 @@ can be set to control the SBT build. For example:
 
 Some of the tests require Spark to be packaged first, so always run `sbt/sbt assembly` the first time.  The following is an example of a correct (build, test) sequence:
 
-    sbt/sbt -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 assembly
-    sbt/sbt -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 test
+    sbt/sbt -Pyarn -Phadoop-2.3 -Phive assembly
+    sbt/sbt -Pyarn -Phadoop-2.3 -Phive test
 
 To run only a specific test suite as follows:
 
-    sbt/sbt -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 "test-only org.apache.spark.repl.ReplSuite"
+    sbt/sbt -Pyarn -Phadoop-2.3 -Phive "test-only org.apache.spark.repl.ReplSuite"
 
 To run test suites of a specific sub project as follows:
 
-    sbt/sbt -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 core/test
+    sbt/sbt -Pyarn -Phadoop-2.3 -Phive core/test
 
 # Speeding up Compilation with Zinc
 

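For quick reference, the build-and-test sequence that the patch settles on (the default Hive 0.13.1 bindings, so no `-Phive-0.12.0` profile) can be sketched as below. This is a sketch only: the `mvn` invocations are assembled and echoed rather than executed, since a real build requires a Spark checkout and Maven on the path.

```shell
# Sketch of the Maven build/test sequence from the updated docs/building-spark.md.
# Commands are echoed, not run; a real build needs a Spark source tree and mvn.
BUILD_CMD="mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive clean package"
TEST_CMD="mvn -Pyarn -Phadoop-2.3 -Phive test"
echo "$BUILD_CMD"
echo "$TEST_CMD"
```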

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
