[ https://issues.apache.org/jira/browse/SPARK-1267?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14192279#comment-14192279 ]
Davies Liu commented on SPARK-1267:
-----------------------------------
Because PySpark depends on the Spark packages, a Python user cannot use it
after 'pip install pyspark' alone, so there is not much benefit in this.
Once we release PySpark separately from Spark, we will have to keep
compatibility across versions of PySpark and Spark, which will be a nightmare
for us (we will not be able to move fast to improve the implementation of
PySpark).
So, I think we cannot do this in the near future. [~prabinb], do you mind
closing the PR?
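
For context, a minimal sketch of the failure mode described above, assuming a
hypothetical pip-installed 'pyspark' package that does not bundle the Spark
jars and locates a Spark distribution via the SPARK_HOME environment variable
(the variable Spark's own launcher scripts use). The check is illustrative,
not part of any actual package:

    import os

    # Illustrative only: a pip-installed pyspark without the Spark jars
    # would still need a full Spark distribution on the machine, which
    # Spark's scripts conventionally locate via SPARK_HOME.
    spark_home = os.environ.get("SPARK_HOME")
    if not spark_home or not os.path.isdir(spark_home):
        raise RuntimeError(
            "pyspark was installed via pip, but no Spark distribution "
            "was found; set SPARK_HOME to a local Spark install."
        )

    # With a matching Spark distribution in place, PySpark works as usual.
    from pyspark import SparkContext

    sc = SparkContext("local", "pip-install-check")
    print(sc.parallelize(range(10)).sum())  # prints 45
    sc.stop()

This is the compatibility burden mentioned above: the pip-installed Python
code and the locally installed Spark distribution would have to agree on
version, or imports and job submission break.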
> Add a pip installer for PySpark
> -------------------------------
>
> Key: SPARK-1267
> URL: https://issues.apache.org/jira/browse/SPARK-1267
> Project: Spark
> Issue Type: Improvement
> Components: PySpark
> Reporter: Prabin Banka
> Priority: Minor
> Labels: pyspark
>
> Please refer to this mail archive,
> http://mail-archives.apache.org/mod_mbox/spark-user/201311.mbox/%3CCAOEPXP7jKiw-3M8eh2giBcs8gEkZ1upHpGb=fqoucvscywj...@mail.gmail.com%3E