[
https://issues.apache.org/jira/browse/SPARK-22406?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16248405#comment-16248405
]
Holden Karau commented on SPARK-22406:
--------------------------------------
Yes. Although this should be fixed in the documented upload process, the
process still has to be run at the end of a release before this issue can be
verified and closed.
On Fri, Nov 10, 2017 at 10:40 PM Felix Cheung (JIRA) <[email protected]> wrote:
> pyspark version tag is wrong on PyPi
> ------------------------------------
>
> Key: SPARK-22406
> URL: https://issues.apache.org/jira/browse/SPARK-22406
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 2.2.0
> Reporter: Kerrick Staley
> Assignee: holdenk
> Priority: Minor
>
> On pypi.python.org, the pyspark package is tagged with version
> {{2.2.0.post0}}: https://pypi.python.org/pypi/pyspark/2.2.0
> However, when you install the package, it has version {{2.2.0}}.
> This has really annoying consequences: if you try {{pip install
> pyspark==2.2.0}}, it won't work. Instead you have to do {{pip install
> pyspark==2.2.0.post0}}. Then, if you later run the same command ({{pip
> install pyspark==2.2.0.post0}}), it won't recognize the existing pyspark
> installation (because it has version {{2.2.0}}) and instead will reinstall
> it, which is very slow because pyspark is a large package.
> This can happen if you add a new package to a {{requirements.txt}} file; you
> end up waiting a lot longer than necessary because every time you run {{pip
> install -r requirements.txt}} it reinstalls pyspark.
> Can you please change the package on PyPi to have the version {{2.2.0}}?
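The reinstall behavior described above follows from PEP 440 version matching: a plain `==2.2.0` specifier does not match the post-release `2.2.0.post0`, and vice versa. A minimal sketch of that rule, using the third-party `packaging` library (pip vendors the same library internally):

```python
# Why pip keeps reinstalling pyspark: per PEP 440, "==2.2.0" does not
# match the post-release 2.2.0.post0, and "==2.2.0.post0" does not match
# 2.2.0. Requires the third-party `packaging` library.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

installed = Version("2.2.0")        # version the installed package reports
on_pypi = Version("2.2.0.post0")    # version tag of the release on PyPI

# `pip install pyspark==2.2.0` finds no matching release on PyPI:
print(on_pypi in SpecifierSet("==2.2.0"))          # False

# `pip install pyspark==2.2.0.post0` does not consider the existing
# installation (version 2.2.0) a match, so it downloads and reinstalls:
print(installed in SpecifierSet("==2.2.0.post0"))  # False
```

Renaming the PyPI release to plain `2.2.0` makes both specifiers agree with the installed metadata, so repeated `pip install -r requirements.txt` runs become no-ops for pyspark.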
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]