GitHub user alope107 opened a pull request:
https://github.com/apache/spark/pull/8318
[SPARK-1267][PYSPARK] Adds pip installer for pyspark
Adds a setup.py so that pyspark can be installed and packaged for pip.
This allows for easier setup and declaration of dependencies. Please see this
discussion for more of the rationale behind this PR:
http://apache-spark-developers-list.1001551.n3.nabble.com/PySpark-on-PyPi-td12626.html
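As a rough illustration (not the exact file in this PR), a minimal setup.py for packaging pyspark might look like the sketch below; the package list, version string, and dependency pin are assumptions for the sketch:

    # Hypothetical, minimal setup.py sketch; names and version are illustrative only.
    from setuptools import setup

    setup(
        name="pyspark",
        version="1.5.0",  # would need to match the Spark version it ships with
        description="Python bindings for Apache Spark",
        packages=["pyspark", "pyspark.mllib", "pyspark.sql", "pyspark.streaming"],
        install_requires=["py4j"],  # declared as a dependency rather than bundled
    )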
It is enforced at runtime that a valid SPARK_HOME is set and that the pyspark
version matches the installed Spark version exactly.
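A sketch of what such a runtime guard could look like (the helper name, the use of
a RELEASE file, and the error wording are illustrative, not the code in this PR):

    # Hypothetical sketch of the runtime check; not the exact implementation in this PR.
    import os

    def check_spark_home(expected_version):
        spark_home = os.environ.get("SPARK_HOME")
        if not spark_home or not os.path.isdir(spark_home):
            raise RuntimeError("SPARK_HOME must point to a valid Spark installation")
        release_file = os.path.join(spark_home, "RELEASE")  # assumes a binary distribution
        with open(release_file) as f:
            if expected_version not in f.read():
                raise RuntimeError("pyspark %s does not match the Spark version in SPARK_HOME"
                                   % expected_version)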
To be used with pip, the package will need to be registered and uploaded to
PyPI; see:
https://docs.python.org/2/distutils/packageindex.html
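For reference, with the Python 2-era distutils workflow described on that page,
registration and upload would be done roughly as follows (run from the directory
containing setup.py):

    $ python setup.py register
    $ python setup.py sdist upload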
This code is based on a PR by @prabinb that I've updated due to renewed
interest; see:
https://github.com/apache/spark/pull/464
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/alope107/spark pip-installer
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/8318.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #8318
----
commit a288923d7600055b9af346ed74f88c7be598fbb1
Author: Auberon Lopez <[email protected]>
Date: 2015-08-19T19:00:46Z
[SPARK-1267][PYSPARK] Adds pip installer for pyspark
----