[ https://issues.apache.org/jira/browse/SPARK-8202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14578425#comment-14578425 ]

Davies Liu commented on SPARK-8202:
-----------------------------------

Workaround: increase the number of partitions used during the sort, or increase 
the memory of the Python worker via spark.python.worker.memory (default is 512m).
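A sketch of applying the workaround at submit time (the script name is a placeholder; the config key and its 512m default come from the comment above):

```shell
# Raise the Python worker memory threshold before spilling (default 512m).
# Partitions can also be increased inside the job, e.g.
# rdd.sortByKey(numPartitions=200).
spark-submit \
  --conf spark.python.worker.memory=1g \
  my_job.py   # placeholder script name
```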

> PySpark: infinite loop during external sort 
> --------------------------------------------
>
>                 Key: SPARK-8202
>                 URL: https://issues.apache.org/jira/browse/SPARK-8202
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.4.0
>            Reporter: Davies Liu
>            Assignee: Davies Liu
>            Priority: Critical
>
> The batch size during external sort grows up to a maximum of 10000, then can 
> shrink down to zero, causing an infinite loop.
> Given the assumption that the items usually have similar sizes, we don't 
> need to adjust the batch size after the first spill.
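A minimal, self-contained sketch of the failure mode described above; the names and numbers are illustrative assumptions, not Spark's actual ExternalSorter code:

```python
# Illustrative sketch: an adaptive batch size that halves on every spill
# can collapse to zero, so a loop advancing by `batch` never progresses.
MAX_BATCH = 10000

def adjust_batch(batch, spilled):
    """Grow the batch when memory is fine, halve it after a spill."""
    if spilled:
        return batch // 2           # integer halving eventually hits 0
    return min(batch * 2, MAX_BATCH)

batch = 100
for _ in range(10):                 # simulate sustained memory pressure
    batch = adjust_batch(batch, spilled=True)

print(batch)  # 0 -- any loop stepping "i += batch" would then never terminate
```

Freezing the batch size after the first spill (the proposed fix) avoids this collapse, since items of similar size make further adjustment unnecessary.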



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
