vrozov commented on code in PR #52139: URL: https://github.com/apache/spark/pull/52139#discussion_r2305322051
##########
core/src/main/scala/org/apache/spark/util/UninterruptibleThread.scala
##########

```
@@ -101,7 +101,11 @@ private[spark] class UninterruptibleThread(
    * is true) and no concurrent [[interrupt()]] call, otherwise false
    */
   def isInterruptible: Boolean = synchronized {
-    shouldInterruptThread = uninterruptible
+    // SPARK-53394: We should not interrupt the thread when it is already interrupted.
+    // Otherwise, the state of `shouldInterruptThread` becomes inconsistent between
```

Review Comment:
   nit: this is a code comment, not a doc comment; using Markdown formatting reduces readability IMO.

##########
core/src/main/scala/org/apache/spark/util/UninterruptibleThread.scala
##########

```
@@ -101,7 +101,11 @@ private[spark] class UninterruptibleThread(
    * is true) and no concurrent [[interrupt()]] call, otherwise false
    */
   def isInterruptible: Boolean = synchronized {
-    shouldInterruptThread = uninterruptible
+    // SPARK-53394: We should not interrupt the thread when it is already interrupted.
+    // Otherwise, the state of `shouldInterruptThread` becomes inconsistent between
+    // `isInterruptible()` and `isInterruptPending()`, leading to `UninterruptibleThread`
+    // be interruptible under `runUninterruptibly`.
+    shouldInterruptThread = uninterruptible || UninterruptibleThread.this.isInterrupted
```

Review Comment:
   I think `isInterrupted` can be moved to line 115:
   ```
   if (!shouldInterruptThread && !awaitInterruptThread && !isInterrupted) { ...
   ```
   along with updating the comment "there is no other threads that called interrupt" to cover both conditions: "awaitInterruptThread is already true or isInterrupted is true".

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
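Both the patch and the reviewer's suggestion rest on the JVM's interrupt-status semantics: `Thread#interrupt()` sets a per-thread flag, the instance method `isInterrupted()` reads that flag *without* clearing it, and only a consumer (the static `Thread.interrupted()`, or a thrown `InterruptedException`) resets it. This is why `isInterrupted` can safely be checked in either place in `isInterruptible`: reading it has no side effect. A minimal standalone sketch of these semantics (plain Java, not Spark code):

```java
public class InterruptFlagDemo {
    public static void main(String[] args) {
        Thread self = Thread.currentThread();

        // Initially the interrupt flag is clear.
        System.out.println(self.isInterrupted());

        // interrupt() sets the flag; on a thread that is not blocked in
        // wait()/sleep()/join(), it has no other immediate effect.
        self.interrupt();

        // The instance method reports the flag WITHOUT clearing it,
        // so repeated reads keep returning true.
        System.out.println(self.isInterrupted());
        System.out.println(self.isInterrupted());

        // The static Thread.interrupted() reports AND clears the flag.
        System.out.println(Thread.interrupted());
        System.out.println(self.isInterrupted());
    }
}
```

Because `isInterrupted()` is side-effect free, hoisting the check into the `if` condition (as the reviewer proposes) and folding it into the assignment (as the patch does) observe the same flag; the difference is only where the uninterruptible/interrupt-pending bookkeeping is updated.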
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org