Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/21628#discussion_r197673014
--- Diff: docs/building-spark.md ---
@@ -215,19 +215,23 @@ If you are building Spark for use in a Python environment and you wish to pip in
Alternatively, you can also run make-distribution with the --pip option.
-## PySpark Tests with Maven
+## PySpark Tests with Maven or SBT
If you are building PySpark and wish to run the PySpark tests you will need to build Spark with Hive support.
./build/mvn -DskipTests clean package -Phive
./python/run-tests
+If you are building PySpark with SBT and wish to run the PySpark tests, you will need to build Spark with Hive support and also build the test components:
+
+ ./build/sbt -Phive clean package
+ ./build/sbt sql/test:compile
--- End diff ---
Hm, shouldn't we better compile other tests too?
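For instance, compiling the test sources of the other modules as well might look like the sketch below. The `-Phive` profile and the `<module>/test:compile` target are taken from the commands quoted in the diff above; the extra module names (`core`, `mllib`) are illustrative assumptions, not part of the PR:

```shell
# Build Spark with Hive support first, as the diff above suggests.
./build/sbt -Phive clean package

# Sketch: compile test components for additional modules, not only sql.
# Module names here (core, mllib) are assumed for illustration.
./build/sbt core/test:compile sql/test:compile mllib/test:compile
```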
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]