An account already exists; the PMC has the info for it. I think we will
need to wait for the 2.2 artifacts to do the actual PyPI upload because of
the local version string in 2.2.1, but rest assured this isn't something
I've lost track of.
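For context on why a local version string blocks the upload: PEP 440 calls the segment after a "+" in a version string a "local version", and PyPI rejects releases that carry one. A minimal sketch of the check (the version strings below are illustrative, not the actual Spark artifact names):

```python
# Minimal sketch: detecting a PEP 440 "local version" segment, which
# PyPI rejects at upload time. Example version strings are made up.

def has_local_version(version: str) -> bool:
    """Return True if the version string carries a PEP 440 local segment."""
    return "+" in version

print(has_local_version("2.2.1+build1"))  # True: not uploadable to PyPI
print(has_local_version("2.2.0"))         # False: fine for PyPI
```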
On Wed, May 24, 2017 at 12:11 AM Xiao Li wrote:
Hi, Holden,
Based on the issue https://github.com/pypa/packaging-problems/issues/90, the
limit has been increased to 250 MB.
Just wondering if we can publish PySpark to PyPI now? Have you created the
account?
Thanks,
Xiao Li
2017-05-12 11:35 GMT-07:00 Sameer Agarwal :
> Holden,
>
> Thanks again
On 05/23/2017 02:45 PM, Mendelson, Assaf wrote:
>
> You are correct,
>
> I actually did not look too deeply into it until now, as I noticed you
> mentioned it is compatible with Python 3 only, and I saw on GitHub
> that mypy or pytype is required.
>
> Because of that, I made my suggestions.
It doesn't break anything at all. You can take the stub files as-is, put
them into the PySpark root, and as long as users are not interested in type
checking, they won't have any runtime impact.
Surprisingly, the current mypy build (mypy==0.511) reports only one
incompatibility with Python 2 (dynamic metaclass
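To illustrate why the stubs are runtime-neutral: a .pyi stub file only declares signatures, so the interpreter never imports it, only type checkers like mypy read it. A hedged sketch of what such a stub might look like (the RDD signatures below are simplified assumptions, not the actual PySpark stubs):

```python
# Hypothetical contents of an rdd.pyi stub file. The methods shown are
# simplified assumptions for illustration; a type checker reads this
# file, while at runtime Python loads only the real rdd.py module.
from typing import Callable, Generic, List, TypeVar

T = TypeVar("T")
U = TypeVar("U")

class RDD(Generic[T]):
    def map(self, f: Callable[[T], U]) -> "RDD[U]": ...
    def filter(self, f: Callable[[T], bool]) -> "RDD[T]": ...
    def collect(self) -> List[T]: ...
```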
Hi all,
I have read some papers about stages, and I know about narrow dependencies
and shuffle dependencies.
Regarding the RDD DAG below, how does Spark generate the stage DAG?
And is this RDD DAG legal?
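Not quoting the Spark scheduler itself, but as a hedged illustration of the rule involved: Spark walks the lineage backward from the final RDD and starts a new stage at every shuffle (wide) dependency, while chains of narrow dependencies stay in the same stage. A toy model of that cut (the dependency graph below is made up for illustration):

```python
# Toy model (not Spark's actual DAGScheduler) of stage cutting: starting
# from the final RDD, follow narrow dependencies into the same stage and
# stop at wide (shuffle) dependencies, which begin a new upstream stage.

# deps maps each RDD name to its (parent, dependency_kind) pairs.
deps = {
    "C": [("A", "narrow"), ("B", "wide")],  # C depends narrowly on A, widely on B
    "A": [],
    "B": [],
}

def same_stage_as(rdd, deps):
    """Return the set of RDDs placed in the same stage as `rdd`."""
    stage = {rdd}
    frontier = [rdd]
    while frontier:
        for parent, kind in deps[frontier.pop()]:
            if kind == "narrow" and parent not in stage:
                stage.add(parent)
                frontier.append(parent)
    return stage

print(sorted(same_stage_as("C", deps)))  # ['A', 'C']; B sits behind a shuffle boundary
```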
Actually there is, at least for PyCharm. I actually opened a JIRA on it
(https://issues.apache.org/jira/browse/SPARK-17333). It describes two ways of
doing it (I also made a GitHub stub at:
https://github.com/assafmendelson/ExamplePysparkAnnotation). Unfortunately, I
never found the time to follow up on it.
Seems useful to do. Is there a way to do this so it doesn't break Python
2.x?
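One approach that should keep Python 2 working is PEP 484 type comments: instead of function annotations (Python 3 syntax), the types go into a comment, which the Python 2 interpreter ignores entirely. A minimal hedged sketch (`add` is a made-up example function, not PySpark code):

```python
# PEP 484 type comments: valid under both Python 2 and Python 3,
# because the annotation lives in a comment the interpreter ignores
# and only a type checker such as mypy interprets.
# `add` is a hypothetical example function, not part of PySpark.

def add(x, y):
    # type: (int, int) -> int
    return x + y

print(add(1, 2))  # 3
```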
On Sun, May 14, 2017 at 11:44 PM, Maciej Szymkiewicz wrote:
> Hi everyone,
>
> For the last few months I've been working on static type annotations for
> PySpark. For those of you, who are not familiar with the idea,