Github user MLnick commented on the issue:
https://github.com/apache/spark/pull/13540
Also, yes, we should actually look at copying docs from the Scala side where
appropriate. We can add that to the QA JIRAs for the Python docs.
On Tue, 7 Jun 2016 at 11:28, Manoj Kumar <[email protected]> wrote:
> LGTM as well, pending the nitpick by @BryanCutler
> <https://github.com/BryanCutler>
>
> Not related, and it's been a while since I hacked on Spark or PySpark, but
> at some point do we need better docs for PySpark? I couldn't figure out how
> the IDFs are calculated without looking at the Scala documentation.
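
For context on that IDF question: per the Scala docs, Spark's IDF uses the smoothed
formula idf = log((m + 1) / (df(t) + 1)), where m is the number of documents and
df(t) the number of documents containing term t. A minimal PySpark sketch of the
usual pipeline, assuming the `pyspark.ml.feature` API (names in the toy corpus are
made up for illustration):

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import Tokenizer, HashingTF, IDF

spark = SparkSession.builder.appName("idf-example").getOrCreate()

# Toy corpus of two documents
df = spark.createDataFrame([
    (0, "spark spark hadoop"),
    (1, "spark python"),
], ["id", "text"])

# Tokenize and hash terms into term-frequency vectors
tokens = Tokenizer(inputCol="text", outputCol="words").transform(df)
tf = HashingTF(inputCol="words", outputCol="rawFeatures",
               numFeatures=32).transform(tokens)

# Fit IDF; each weight is log((m + 1) / (df(t) + 1)) per the Scala docs
idfModel = IDF(inputCol="rawFeatures", outputCol="features").fit(tf)
idfModel.transform(tf).select("id", "features").show(truncate=False)
```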