[ https://issues.apache.org/jira/browse/SPARK-5363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14320987#comment-14320987 ]
Davies Liu commented on SPARK-5363:
-----------------------------------

[~TJKlein] Could you try the patch in https://github.com/apache/spark/pull/4601 to see whether it fixes your problem?

> Spark 1.2 freezes without error notification
> --------------------------------------------
>
>                 Key: SPARK-5363
>                 URL: https://issues.apache.org/jira/browse/SPARK-5363
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.2.0
>            Reporter: Tassilo Klein
>            Assignee: Davies Liu
>            Priority: Critical
>
> After a number of calls to a map().collect() statement, Spark freezes without
> reporting any error. Within the map, a large broadcast variable is used.
> The freezing can be avoided by setting 'spark.python.worker.reuse = false'
> (Spark 1.2) or by using an earlier version, however at the price of lower speed.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
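For readers hitting the same hang, a minimal sketch of the workaround mentioned in the issue follows. It is a configuration sketch, not the fix from PR 4601: it only sets the `spark.python.worker.reuse` property described above and reproduces the usage pattern (repeated `map().collect()` over a large broadcast variable). The app name and the toy data are illustrative assumptions; the config key and PySpark calls are the standard API.

```python
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("spark-5363-workaround")  # illustrative name
        # Workaround from SPARK-5363: fork a fresh Python worker per task
        # instead of reusing workers. Slower, but avoids the reported freeze
        # on Spark 1.2.
        .set("spark.python.worker.reuse", "false"))
sc = SparkContext(conf=conf)

# The pattern reported to trigger the freeze: repeated map().collect()
# calls that reference a large broadcast variable inside the map function.
big = sc.broadcast(list(range(1000)))  # stand-in for a genuinely large object
for _ in range(10):
    sc.parallelize(range(4)).map(lambda x: x + len(big.value)).collect()
```

Note that `spark.python.worker.reuse` can also be passed on the command line via `--conf spark.python.worker.reuse=false` with `spark-submit`, which avoids hard-coding the workaround in application code.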