Github user mridulm commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17166#discussion_r107232894
  
    --- Diff: core/src/main/scala/org/apache/spark/TaskContextImpl.scala ---
    @@ -140,16 +140,22 @@ private[spark] class TaskContextImpl(
       }
     
       /** Marks the task for interruption, i.e. cancellation. */
    -  private[spark] def markInterrupted(): Unit = {
    -    interrupted = true
    +  private[spark] def markInterrupted(reason: String): Unit = {
    +    maybeKillReason = Some(reason)
    +  }
    +
    +  private[spark] override def killTaskIfInterrupted(): Unit = {
    +    if (maybeKillReason.isDefined) {
    +      throw new TaskKilledException(maybeKillReason.get)
    --- End diff --
    
    This is not thread safe. While we currently never reset the kill reason 
back to `None`, so this may be fine in practice, it can lead to future issues.
    
    Either make all accesses and updates to the kill reason synchronized, or 
capture `maybeKillReason` into a local variable and use that local in both the 
`if` and the `throw`.
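
    A minimal sketch of the local-capture variant, assuming `maybeKillReason` 
is a `@volatile var Option[String]`; the class and exception names here are 
illustrative stand-ins for `TaskContextImpl` and Spark's 
`TaskKilledException`, not the actual implementation:

```scala
// Stand-in for org.apache.spark.TaskKilledException (illustrative only).
class TaskKilledException(val reason: String) extends RuntimeException(reason)

// Hypothetical sketch of the suggested fix, not the actual TaskContextImpl.
class TaskStateSketch {
  @volatile private var maybeKillReason: Option[String] = None

  def markInterrupted(reason: String): Unit = {
    maybeKillReason = Some(reason)
  }

  def killTaskIfInterrupted(): Unit = {
    // Read the shared field exactly once; the isDefined check and the .get
    // then operate on the same immutable Option instance, so a concurrent
    // update cannot make the check pass and the .get fail.
    val reason = maybeKillReason
    if (reason.isDefined) {
      throw new TaskKilledException(reason.get)
    }
  }
}
```

    The synchronized alternative works too, but the local capture is cheaper: 
`Option` is immutable, so one volatile read yields a consistent snapshot 
without taking a lock.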


