Github user squito commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20987#discussion_r179776576
  
    --- Diff: core/src/test/scala/org/apache/spark/executor/ExecutorSuite.scala 
---
    @@ -330,6 +362,15 @@ class FetchFailureHidingRDD(
           case t: Throwable =>
             if (throwOOM) {
               throw new OutOfMemoryError("OOM while handling another exception")
    +        } else if (interrupt) {
    +          // make sure our test is setup correctly
     +          assert(TaskContext.get().asInstanceOf[TaskContextImpl].fetchFailed.isDefined)
    +          // signal our test is ready for the task to get killed
    --- End diff --
    
    I prefer the original comment -- the mechanics of what is waiting on the latch are easy enough to follow; it's more important to explain why.
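    For context, the latch pattern under discussion can be sketched as follows. This is a hypothetical, simplified stand-in (the object `LatchSketch`, the thread body, and the `observed` string are all invented for illustration, not code from the PR): the task thread reaches the state under test, counts down a `CountDownLatch` to signal readiness, and only then does the test thread issue the kill (here, an interrupt).

    ```scala
    import java.util.concurrent.CountDownLatch

    // Hypothetical sketch of coordinating a task thread with the test thread.
    // The latch guarantees the "kill" is only issued after the task has
    // reached the interesting state (in the real test, a recorded fetch failure).
    object LatchSketch {
      def run(): String = {
        val readyForKill = new CountDownLatch(1)
        var observed: String = ""
        val task = new Thread(() => {
          // ... task does its work and hits the state under test ...
          observed = "fetch-failure-recorded" // stand-in for fetchFailed.isDefined
          readyForKill.countDown()            // signal: safe to kill the task now
        })
        task.start()
        readyForKill.await()                  // test blocks until the task signals
        task.interrupt()                      // then "kills" the task
        task.join()
        observed
      }
    }
    ```

    The point of the review comment is that *what* waits on the latch is obvious from code like this; the comment in the test should instead say *why* the coordination is needed.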


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
