[ https://issues.apache.org/jira/browse/SPARK-13587?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15310782#comment-15310782 ]
Greg Bowyer commented on SPARK-13587:
-------------------------------------
.... I have been out of this world for a long time.
The Spex extension was designed to get around this, for a world where NFS
was being removed from a cluster. Right now it's an ugly, ugly hack, but I
might be able to spend some time making it less of one.
If this were integrated into Spark, would we want to make it transparent,
or provide a flag to the launcher scripts? I figure a
--py-deps=requirements.txt might work?
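
A rough sketch of what such a flag might drive on each executor; note that
the --py-deps flag is only the suggestion above, and the env directory and
bootstrap helper here are illustrative assumptions, not existing Spark
behavior:

    # Hypothetical executor-side bootstrap for a --py-deps=requirements.txt
    # flag; the flag, the env directory, and this helper are assumptions.
    import os
    import subprocess
    import venv

    def bootstrap_env(requirements, env_dir="/tmp/pyspark_env"):
        # Create an isolated virtualenv on the executor host, with pip.
        venv.create(env_dir, with_pip=True)
        # Install the job's dependencies (including transitive ones) into it.
        pip = os.path.join(env_dir, "bin", "pip")
        subprocess.check_call([pip, "install", "-r", requirements])
        # The launcher would then point Python workers at this interpreter.
        return os.path.join(env_dir, "bin", "python")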
> Support virtualenv in PySpark
> -----------------------------
>
> Key: SPARK-13587
> URL: https://issues.apache.org/jira/browse/SPARK-13587
> Project: Spark
> Issue Type: New Feature
> Components: PySpark
> Reporter: Jeff Zhang
>
> Currently, it's not easy for users to add third-party Python packages in
> PySpark.
> * One way is to use --py-files (suitable for simple dependencies, but not
> for complicated ones, especially those with transitive dependencies)
> * Another way is to install packages manually on each node (time-wasting,
> and not easy to switch between different environments)
> Python now has 2 different virtualenv implementations: one is native
> virtualenv, the other is through conda. This jira is trying to bring these
> 2 tools to a distributed environment.
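
For reference, the first workaround described above looks roughly like this
from the driver side; this is a sketch only, and deps.zip is an assumed
archive of pre-built, pure-Python dependencies:

    # Minimal sketch of the --py-files style workaround; deps.zip is an
    # assumed archive of pure-Python dependencies built ahead of time.
    from pyspark import SparkContext

    sc = SparkContext(appName="py-deps-demo")
    # Ship the archive to every executor and add it to each worker's sys.path.
    sc.addPyFile("deps.zip")

    # Modules inside deps.zip are now importable within tasks.
    def uses_dep(x):
        import six  # assumed example package bundled in deps.zip
        return six.text_type(x)

    print(sc.parallelize(range(3)).map(uses_dep).collect())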