[ 
https://issues.apache.org/jira/browse/SPARK-18128?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15631463#comment-15631463
 ] 

holdenk commented on SPARK-18128:
---------------------------------

Extracted from the discussion around SPARK-1267:

People who are officially allowed to make releases will need to register on 
PyPI and the PyPI test server, create .pypirc files with their credentials, 
and be added to the "pyspark" or "apache-pyspark" project (depending on which 
name is chosen). The release script will also need to be updated slightly. 
Code-wise, the changes required for SPARK-18128 are relatively minor: whatever 
renaming of the package may be required, adding a shell variable to control 
which PyPI server is published to, and, during publish, switching sdist to 
sdist upload.
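
As a rough sketch of what that release-script change might look like (the 
variable name PYPI_REPOSITORY, the .pypirc section names, and the test-server 
default are illustrative assumptions, not the actual script details):

```shell
# Hypothetical ~/.pypirc for a release manager (section names assumed):
#
#   [distutils]
#   index-servers =
#       pypi
#       pypitest
#
#   [pypi]
#   repository = https://pypi.python.org/pypi
#
#   [pypitest]
#   repository = https://testpypi.python.org/pypi

# Assumed shell variable selecting which PyPI server receives the upload;
# defaulting to the test server avoids accidental production uploads.
PYPI_REPOSITORY="${PYPI_REPOSITORY:-pypitest}"

# The existing step only builds the source distribution:
#   python setup.py sdist
# For publishing, "sdist" becomes "sdist upload", pointed at the chosen server:
python setup.py sdist upload -r "$PYPI_REPOSITORY"
```

This is a release-procedure fragment only; actually running the upload 
requires valid credentials in .pypirc and project permissions on the chosen 
server.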

> Add support for publishing to PyPI
> ----------------------------------
>
>                 Key: SPARK-18128
>                 URL: https://issues.apache.org/jira/browse/SPARK-18128
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: holdenk
>
> After SPARK-1267 is done we should add support for publishing to PyPI similar 
> to how we publish to maven central.
> Note: one of the open questions is what to do about the package name, since 
> someone has already registered the name PySpark on PyPI - we could use 
> ApachePySpark, or we could try to find who registered PySpark and get 
> them to transfer it to us (since they haven't published anything, they may 
> be fine with that?)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
