Github user nchammas commented on the issue:
https://github.com/apache/spark/pull/15659
I tested this out with Python 3 on my system with the following commands:
```
# Inside ./spark/.
python3 -m venv venv
source venv/bin/activate
./dev/make-distribution.sh --pip
pip install -e ./python/
which pyspark
pyspark
```
It seems there is a bug in how `SPARK_HOME` is computed:
```
[make-distribution.sh output snipped]
$ pip install -e ./python/
Obtaining file:///.../apache/spark/python
Collecting py4j==0.10.4 (from pyspark==2.1.0.dev1)
Downloading py4j-0.10.4-py2.py3-none-any.whl (186kB)
100% |████████████████████████████████| 194kB 2.0MB/s
Installing collected packages: py4j, pyspark
Running setup.py develop for pyspark
Successfully installed py4j-0.10.4 pyspark
$ which pyspark
.../apache/spark/venv/bin/pyspark
$ pyspark
Could not find valid SPARK_HOME while searching <map object at 0x102bc15f8>
.../apache/spark/venv/bin/pyspark: line 24: None/bin/load-spark-env.sh: No such file or directory
.../apache/spark/venv/bin/pyspark: line 77: .../apache/spark/None/bin/spark-submit: No such file or directory
```
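The `<map object at 0x102bc15f8>` in the error message suggests a Python 3 incompatibility: `map()` returns a lazy iterator in Python 3, so interpolating it into a string prints the object's repr instead of the candidate paths. A minimal sketch of the symptom (the `candidates` list and message format are illustrative, not the actual `find_spark_home` code):

```python
import os

# Hypothetical candidate directories to search for a Spark installation.
candidates = ["some/spark/dir"]

# In Python 3, map() yields a lazy iterator rather than a list.
paths = map(os.path.abspath, candidates)

# Formatting the iterator into a message prints its repr,
# e.g. "<map object at 0x102bc15f8>", not the paths themselves.
message = "Could not find valid SPARK_HOME while searching {0}".format(paths)
assert "map object" in message

# Materializing the iterator restores a readable message.
message = "Could not find valid SPARK_HOME while searching {0}".format(list(paths))
assert "map object" not in message
```

If the search logic also *consumes* that iterator before the error is printed (iterators are single-pass), the search itself could come up empty on a second traversal, which would explain the `None` home directory leaking into the wrapper script paths above.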