[ https://issues.apache.org/jira/browse/SPARK-5687?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Lianhui Wang updated SPARK-5687:
--------------------------------
    Summary: in TaskResultGetter need to catch OutOfMemoryError.  (was: in TaskResultGetter need to catch OutOfMemoryError and report failed when it cannot fetch results.)

> in TaskResultGetter need to catch OutOfMemoryError.
> ---------------------------------------------------
>
>                 Key: SPARK-5687
>                 URL: https://issues.apache.org/jira/browse/SPARK-5687
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Lianhui Wang
>
> Because enqueueSuccessfulTask uses a separate thread to fetch the result, a
> very large result may throw an OutOfMemoryError. If we do not catch that
> OutOfMemoryError, the DAGScheduler does not learn the status of this task.
> Another case: when totalResultSize > maxResultSize and the large result
> cannot be fetched, we need to report a failed status to the DAGScheduler.
> If no status at all is reported for the task, the DAGScheduler likewise
> never learns the task's status.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
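The two failure modes above can be sketched as follows. This is an illustrative sketch only, not Spark's actual TaskResultGetter code: the names `processResult`, `report`, `TaskStatus`, `Succeeded`, and `Failed` are hypothetical stand-ins for the scheduler callback path, and the size check mirrors the `totalResultSize > maxResultSize` condition from the issue.

```scala
// Hypothetical sketch of the behavior SPARK-5687 asks for; names are
// illustrative, not Spark's real API.
object ResultGetterSketch {
  sealed trait TaskStatus
  case object Succeeded extends TaskStatus
  final case class Failed(reason: String) extends TaskStatus

  // Runs on the result-getter thread. Both failure modes from the issue
  // are reported back to the scheduler instead of being silently dropped.
  def processResult(resultSize: Long,
                    maxResultSize: Long,
                    fetch: () => Array[Byte],
                    report: TaskStatus => Unit): Unit = {
    if (maxResultSize > 0 && resultSize > maxResultSize) {
      // Case 2: the result is too large to fetch at all; report a failed
      // status so the scheduler still learns the task's fate.
      report(Failed(s"result size $resultSize exceeds maxResultSize $maxResultSize"))
    } else {
      try {
        fetch()
        report(Succeeded)
      } catch {
        // Case 1: OutOfMemoryError is an Error, not an Exception, so a
        // plain `case e: Exception` handler would miss it and the
        // scheduler would never hear about this task.
        case oom: OutOfMemoryError =>
          report(Failed("failed to fetch task result: " + oom.getMessage))
      }
    }
  }
}
```

The key point is that deserializing a large result happens on a pool thread, so an uncaught OutOfMemoryError there does not propagate anywhere useful; catching it and converting it into an explicit failed-status report keeps the scheduler informed.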