GitHub user nchammas commented on the issue:
https://github.com/apache/spark/pull/15659
From the PR description:
> figure out who owns the pyspark package name on prod PyPI (is it someone
> within the project, or should we ask PyPI, or should we choose a different
> name to publish with, like ApachePySpark?)
Don't we want to publish to `apache-spark`? Dunno if Apache has any rules
about that. For prior art, see [`apache-libcloud` on
PyPI](https://pypi.org/project/apache-libcloud/).
Btw, how did you determine that `pyspark` is taken on PyPI? We can
definitely reach out to the admins and ask if they can release the name. I'll
find out exactly how to do that.
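If it helps, one quick way to check name availability (not necessarily how it
was checked here) is to query PyPI's JSON API at
`https://pypi.org/pypi/<name>/json`; a 404 generally means the name is free.
A minimal sketch, assuming that endpoint:

```python
# Minimal sketch: check whether a name is already registered on PyPI
# by querying the JSON API (https://pypi.org/pypi/<name>/json).
# A 200 response means the name is taken; a 404 means it appears to be free.
import urllib.error
import urllib.request


def pypi_name_taken(name):
    url = "https://pypi.org/pypi/{}/json".format(name)
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as e:
        if e.code == 404:
            return False
        raise


for candidate in ["pyspark", "apache-spark"]:
    print(candidate, "taken" if pypi_name_taken(candidate) else "available")
```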