Buck created SPARK-5929:
---------------------------
Summary: PySpark: Register a pip requirements file with spark_context
Key: SPARK-5929
URL: https://issues.apache.org/jira/browse/SPARK-5929
Project: Spark
Issue Type: Improvement
Components: PySpark
Reporter: Buck
Priority: Minor
I've been doing a lot of work on shipping dependencies to workers, since it is
non-trivial for me to have every worker include the proper dependencies in its
own environment.
To work around this, I added an addRequirementsFile() method that takes a pip
requirements file, downloads the packages, repackages them so they can be
registered with addPyFiles, and ships them to the workers.
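For illustration only (the actual patch is in the fork linked below), a minimal
standalone sketch of the idea might look like the following. It assumes a pip
new enough to provide the `download` subcommand and requirements that resolve
to wheels; sdists would still need to be repackaged into zip archives first:

{code:python}
import os
import subprocess
import tempfile

from pyspark import SparkContext


def add_requirements_file(sc, requirements_path):
    """Download the packages listed in a pip requirements file and
    register them with the SparkContext so workers can import them."""
    download_dir = tempfile.mkdtemp()
    # Fetch the packages locally without installing them.
    # (Assumes a pip version that has the `download` subcommand.)
    subprocess.check_call(
        ["pip", "download", "-r", requirements_path, "-d", download_dir]
    )
    for name in os.listdir(download_dir):
        # Wheels are zip archives, so addPyFile can ship them to the
        # workers and place them on each executor's PYTHONPATH.
        sc.addPyFile(os.path.join(download_dir, name))


sc = SparkContext(appName="requirements-demo")
add_requirements_file(sc, "requirements.txt")
{code}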
Here is a comparison of what I've done on the Palantir fork:
https://github.com/buckheroux/spark/compare/palantir:master...master