Github user davies commented on a diff in the pull request:
https://github.com/apache/spark/pull/2563#discussion_r18432669
--- Diff: python/run-tests ---
@@ -60,56 +60,58 @@ fi
echo "Testing with Python version:"
$PYSPARK_PYTHON --version
-run_test "pyspark/rdd.py"
-run_test "pyspark/context.py"
-run_test "pyspark/conf.py"
run_test "pyspark/sql.py"
-# These tests are included in the module-level docs, and so must
--- End diff ---
you can set up the paths in your .bashrc:
```
export SPARK_HOME=path_to_spark
export PYTHONPATH=${SPARK_HOME}/python/:${SPARK_HOME}/python/lib/py4j-0.8.2.1-src.zip
```
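A quick way to confirm the path is picked up (just a sanity check, assuming the exports above have been sourced):
```
python -c "import pyspark; print(pyspark.__file__)"
```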
then you can run any PySpark job directly with python (or run a single test):
```
python python/pyspark/sql.py
```
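The same setup also lets you run the unittest-based suites directly, for instance (the test class name here is only illustrative, check python/pyspark/tests.py for the actual names):
```
python -m unittest pyspark.tests.TestRDDFunctions
```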