Github user HyukjinKwon commented on the issue:
https://github.com/apache/spark/pull/19782
This is also partly for running Python coverage without extra code changes. I know a hacky way to run this (see
https://github.com/apache/spark/pull/19630#issuecomment-345490662 and
https://github.com/apache/spark/pull/19630#issuecomment-345171997).
Now we can do it, for example, as below:
```
pip install coverage
# Build Spark (http://spark.apache.org/docs/latest/building-spark.html)
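# Remove the zipped PySpark so tests import (and coverage measures) the plain
# source files under python/pyspark instead of the zip.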
rm python/lib/pyspark.zip
rm -fr .coverage .coverage.*  # also remove per-process data files from previous runs
rm -fr coverage_html
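# Disable the Python worker daemon so each worker is a fresh process launched
# through PYSPARK_PYTHON, and therefore runs under the coverage wrapper below.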
echo "spark.python.use.daemon false" >> conf/spark-defaults.conf
echo "
#!/usr/bin/env bash
coverage run -p \$@
" > coverage_python
chmod 755 coverage_python
# Run actual Python tests
PATH=$(pwd):$PATH PYSPARK_PYTHON=coverage_python SPARK_TESTING=1 \
  bin/pyspark pyspark.sql.tests VectorizedUDFTests
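# Clean up the config hack (caution: this removes the whole spark-defaults.conf,
# not just the line we appended).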
rm conf/spark-defaults.conf
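# Merge the per-process .coverage.* data files into a single .coverage file.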
coverage combine
coverage html -d coverage_html -i
open coverage_html/index.html  # on macOS; otherwise open index.html in your browser
```
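For the record, as far as I understand this works because `coverage run -p` (parallel mode) makes each Python process write its own `.coverage.<machine>.<pid>.<random>` data file, and with the worker daemon disabled every Spark worker is started through `PYSPARK_PYTHON`, i.e., through the `coverage_python` wrapper; `coverage combine` then merges the per-worker data files into one report.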