[ https://issues.apache.org/jira/browse/SPARK-25081?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16576636#comment-16576636 ]
Shixiong Zhu commented on SPARK-25081:
--------------------------------------

That's possible. That's why I added the "correctness" label.

> Nested spill in ShuffleExternalSorter may access a released memory page
> ------------------------------------------------------------------------
>
>                 Key: SPARK-25081
>                 URL: https://issues.apache.org/jira/browse/SPARK-25081
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.1
>            Reporter: Shixiong Zhu
>            Assignee: Shixiong Zhu
>            Priority: Major
>              Labels: correctness
>
> This issue is pretty similar to SPARK-21907.
> "allocateArray" in [ShuffleInMemorySorter.reset|https://github.com/apache/spark/blob/9b8521e53e56a53b44c02366a99f8a8ee1307bbf/core/src/main/java/org/apache/spark/shuffle/sort/ShuffleInMemorySorter.java#L99] may trigger a spill and cause ShuffleInMemorySorter to access the released `array`. Meanwhile, another task may get the same memory page from the pool, so two tasks end up accessing the same memory page. When a task reads memory written by another task, many kinds of failures can happen. Here are some examples I have seen:
> - JVM crash. (This is easy to reproduce in a unit test, as we fill newly allocated and deallocated memory with 0xa5 and 0x5a bytes, which usually produces an invalid memory address.)
> - java.lang.IllegalArgumentException: Comparison method violates its general contract!
> - java.lang.NullPointerException at org.apache.spark.memory.TaskMemoryManager.getPage(TaskMemoryManager.java:384)
> - java.lang.UnsupportedOperationException: Cannot grow BufferHolder by size -536870912 because the size after growing exceeds size limitation 2147483632

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
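The nested-spill race described in the issue can be sketched with simplified stand-ins. Everything below (ToyMemoryManager, ToySorter, the POISON fill) is illustrative and not Spark's actual API: the buggy reset() frees its pointer array and then allocates a replacement, so a spill triggered from inside the allocation still sees the released memory; one way to avoid this, similar in spirit to the SPARK-21907 fix, is to detach the array before freeing so a nested spill observes an empty, consistent sorter.

```java
// Minimal sketch of the nested-spill use-after-free. Class names are
// hypothetical stand-ins for Spark's TaskMemoryManager/ShuffleInMemorySorter.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class NestedSpillSketch {

    // Mimics Spark's debug fill of freed memory (0xa5... bytes).
    static final long POISON = 0xa5a5a5a5L;

    interface Spillable {
        void spill();
    }

    /** Toy memory manager: when under pressure, allocation spills consumers first. */
    static class ToyMemoryManager {
        private final List<Spillable> consumers = new ArrayList<>();
        boolean underPressure = false;

        void register(Spillable c) { consumers.add(c); }

        long[] allocateArray(int size) {
            if (underPressure) {
                // The "nested spill": allocation re-enters registered consumers.
                for (Spillable c : consumers) c.spill();
            }
            return new long[size];
        }

        void freeArray(long[] a) {
            // Released memory may be handed to another task and overwritten;
            // poisoning it models that reuse.
            Arrays.fill(a, POISON);
        }
    }

    /** Toy in-memory sorter whose reset() reallocates its pointer array. */
    static class ToySorter implements Spillable {
        private final ToyMemoryManager mm;
        private final boolean detachBeforeFree; // the fixed behavior
        long[] array = new long[] {42L};
        long observedDuringSpill = -1L;

        ToySorter(ToyMemoryManager mm, boolean detachBeforeFree) {
            this.mm = mm;
            this.detachBeforeFree = detachBeforeFree;
            mm.register(this);
        }

        @Override
        public void spill() {
            // A nested spill reads the array; with the bug it sees freed memory.
            if (array != null) observedDuringSpill = array[0];
        }

        void reset() {
            if (detachBeforeFree) {
                // Fixed: detach first, so a nested spill sees no array at all.
                long[] old = array;
                array = null;
                mm.freeArray(old);
                array = mm.allocateArray(1);
            } else {
                // Buggy: free, then allocate; the allocation may trigger spill()
                // while `array` still points at the released page.
                mm.freeArray(array);
                array = mm.allocateArray(1);
            }
        }
    }

    /** Runs one reset under memory pressure; returns what the nested spill read. */
    static long run(boolean fixed) {
        ToyMemoryManager mm = new ToyMemoryManager();
        ToySorter sorter = new ToySorter(mm, fixed);
        mm.underPressure = true; // force the next allocation to spill
        sorter.reset();
        return sorter.observedDuringSpill;
    }

    public static void main(String[] args) {
        System.out.println("buggy spill observed: 0x" + Long.toHexString(run(false)));
        System.out.println("fixed spill observed: " + run(true));
    }
}
```

In the buggy path the nested spill reads the poisoned value, standing in for the JVM crashes and corrupted comparisons listed above; in the fixed path it reads nothing.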