GitHub user jsnowacki opened a pull request:
https://github.com/apache/spark/pull/19807
[SPARK-22495] Fix setup of SPARK_HOME variable on Windows
## What changes were proposed in this pull request?
This is a cherry-pick of the original PR 19370 onto branch-2.2, as suggested
in https://github.com/apache/spark/pull/19370#issuecomment-346526920.
This fixes the way `SPARK_HOME` is resolved on Windows. While the previous
version worked with the built release download, the directory layout changed
slightly for the PySpark `pip` or `conda` install. This had been reflected in
the Linux scripts in `bin` but not in the Windows `cmd` files.
The first fix improves the way the `jars` directory is found, as this was
stopping the Windows `pip/conda` install from working; the JARs were not
found during Session/Context setup. A sketch of the idea is shown below.
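As a rough illustration (not the exact patch), the jars lookup in a script such as `spark-class2.cmd` could check for `%SPARK_HOME%\jars` directly instead of relying on a marker only present in the built release, so the `pip/conda` layout is also recognized; `SPARK_SCALA_VERSION` is assumed to be set earlier in the script:

```cmd
rem Hypothetical sketch: prefer the jars directory that pip/conda installs ship,
rem falling back to the build output location used when running from a source tree.
if exist "%SPARK_HOME%\jars" (
  set SPARK_JARS_DIR=%SPARK_HOME%\jars
) else (
  set SPARK_JARS_DIR=%SPARK_HOME%\assembly\target\scala-%SPARK_SCALA_VERSION%\jars
)

if not exist "%SPARK_JARS_DIR%" (
  echo Failed to find Spark jars directory.
  exit /b 1
)
```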
The second fix adds a `find-spark-home.cmd` script, which, like the Linux
version, uses the `find_spark_home.py` script to resolve `SPARK_HOME`. It is
based on the `find-spark-home` bash script, though some operations are done in
a different order due to limitations of the `cmd` script language. If the
`SPARK_HOME` environment variable is already set, the Python script
`find_spark_home.py` will not be run. The process can fail if Python is not
installed, but this path is mostly taken when PySpark is installed via
`pip/conda`, so some Python is present on the system.
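A minimal sketch of what such a `cmd` script might look like, assuming the Python helper lives next to the script in a `pip/conda` install and that a plain `python` interpreter is on `PATH` (the real script may honour `PYSPARK_DRIVER_PYTHON`/`PYSPARK_PYTHON` instead):

```cmd
@echo off
rem Hypothetical sketch of a find-spark-home.cmd.

rem %~dp0 expands to the directory containing this script.
set FIND_SPARK_HOME_PYTHON_SCRIPT=%~dp0find_spark_home.py

rem Only attempt to resolve SPARK_HOME if it is not already set.
if not "x%SPARK_HOME%"=="x" goto :eof

if not exist "%FIND_SPARK_HOME_PYTHON_SCRIPT%" (
  rem No helper next to this script: assume the standard layout with bin\ under SPARK_HOME.
  set SPARK_HOME=%~dp0..
) else (
  rem pip/conda install: ask the Python helper where Spark lives.
  for /f "delims=" %%i in ('python "%FIND_SPARK_HOME_PYTHON_SCRIPT%"') do set SPARK_HOME=%%i
)
```

As described above, if Python is missing this resolution fails, but that situation is unlikely for a `pip/conda` install, which requires Python in the first place.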
## How was this patch tested?
Tested manually on a local installation.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jsnowacki/spark-1 fix_spark_cmds_2
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/19807.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #19807
----
commit bd24e470227437a52b07d95335579f1afcdda905
Author: Jakub Nowacki <[email protected]>
Date: 2017-10-06T12:06:15Z
[SPARK-22495] Fix setup of SPARK_HOME variable on Windows
(cherry picked from commit b58f740)
----