Github user MJFND commented on the issue:
https://github.com/apache/spark/pull/14658
Okay, but even if that's not the case, increasing the number of shuffle partitions should fix it, yet it doesn't.
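For reference, this is the setting being discussed, a minimal sketch of how one would raise it (the value 400 is arbitrary for illustration; the Spark SQL default is 200):

```scala
// Raise the number of partitions used for shuffles in joins/aggregations.
// 400 is an illustrative value, not a recommendation.
spark.conf.set("spark.sql.shuffle.partitions", "400")
```

If raising this has no effect, the problem is often skew in the shuffle keys rather than the partition count itself.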
On Dec 26, 2017 8:51 PM, "Guoqiang Li" <[email protected]> wrote:
> Spark 2.2 has fixed this issue.
>