GitHub user bersprockets opened a pull request:
https://github.com/apache/spark/pull/21628
[SPARK-23776][DOC] Update instructions for running PySpark after building
with SBT
## What changes were proposed in this pull request?
This update tells the reader how to build Spark with SBT so that the
pyspark-sql tests will succeed.
If you follow the current instructions, pyspark/sql/udf.py fails with:
<pre>
AnalysisException: u'Can not load class test.org.apache.spark.sql.JavaStringLength, please make sure it is on the classpath;'
</pre>
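For context, test.org.apache.spark.sql.JavaStringLength is a test fixture
that lives in the sql/core test artifacts, which a plain package build does
not put on the classpath. A minimal sketch of the kind of workflow the
updated doc describes (the exact commands here are an assumption, not quoted
from the patch):
<pre>
# Build Spark with Hive support (assumed profile; the pyspark-sql tests
# need it), then compile the test classes so fixtures such as
# JavaStringLength end up on the classpath.
./build/sbt -Phive clean package
./build/sbt test:compile

# Run the PySpark SQL test module.
./python/run-tests --modules=pyspark-sql
</pre>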
## How was this patch tested?
I ran the doc build command (SKIP_API=1 jekyll build) and eyeballed the
result.
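For anyone wanting to reproduce the check, a minimal sketch of the usual
Spark docs workflow (paths and commands assumed from the docs/ conventions,
not quoted from this patch):
<pre>
cd docs
# Build the site, skipping the slow API doc generation.
SKIP_API=1 jekyll build
# Or serve locally and browse to http://localhost:4000 to inspect the result.
SKIP_API=1 jekyll serve --watch
</pre>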
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/bersprockets/spark SPARK-23776_doc
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/21628.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #21628
----
commit 9fcd05d7cb52a68bea930625605013397b4989f6
Author: Bruce Robbins <bersprockets@...>
Date: 2018-06-25T02:07:12Z
Update build doc for running pyspark after building with sbt
----