Sital Kedia created SPARK-22827:
-----------------------------------

             Summary: Avoid throwing OutOfMemoryError in case of exception in spill
                 Key: SPARK-22827
                 URL: https://issues.apache.org/jira/browse/SPARK-22827
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.0
            Reporter: Sital Kedia


Currently, the task memory manager throws an OutOfMemoryError when an IO 
exception happens in spill() - 
https://github.com/apache/spark/blob/master/core/src/main/java/org/apache/spark/memory/TaskMemoryManager.java#L194.
 Similarly, there are many other places in the code where, if a task is not 
able to acquire memory due to an exception, we throw an OutOfMemoryError. 
This kills the entire executor, and hence fails all the tasks running on 
that executor, instead of failing just the one task. 
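A minimal sketch of the idea, assuming a hypothetical task-level exception type (the class and method names below are illustrative, not Spark's actual API): instead of rethrowing an IO failure during spill as an OutOfMemoryError, which the executor JVM treats as fatal, throw an ordinary RuntimeException that the task runner can catch and turn into a single task failure.

```java
import java.io.IOException;

// Hypothetical task-scoped exception: unlike OutOfMemoryError (an Error),
// a RuntimeException is caught by the task runner and fails only this
// task, leaving the executor and its other tasks alive.
class SpillFailedException extends RuntimeException {
    SpillFailedException(String msg, Throwable cause) {
        super(msg, cause);
    }
}

class TaskMemoryManagerSketch {
    // Stand-in for a memory consumer's spill that hits an IO error.
    static void spill() throws IOException {
        throw new IOException("No space left on device");
    }

    static long acquireExecutionMemory() {
        try {
            spill();
        } catch (IOException e) {
            // Before: throw new OutOfMemoryError("error while calling spill()"),
            // which kills the whole executor.
            // After: throw a task-level exception so only this task fails.
            throw new SpillFailedException("error while calling spill()", e);
        }
        return 0L;
    }
}
```

The key point is that the replacement exception must not extend java.lang.Error, so the executor's uncaught-exception handling does not treat it as a JVM-fatal condition.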



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
