Github user gaborgsomogyi commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20888#discussion_r181459717
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/DataFrameRangeSuite.scala ---
    @@ -152,22 +154,28 @@ class DataFrameRangeSuite extends QueryTest with SharedSQLContext with Eventuall
       }
     
       test("Cancelling stage in a query with Range.") {
    +    val slices = 10
    +
         val listener = new SparkListener {
    -      override def onJobStart(jobStart: SparkListenerJobStart): Unit = {
    -        eventually(timeout(10.seconds), interval(1.millis)) {
    -          assert(DataFrameRangeSuite.stageToKill > 0)
    +      override def onTaskStart(taskStart: SparkListenerTaskStart): Unit = {
    +        eventually(timeout(10.seconds)) {
    +          assert(DataFrameRangeSuite.isTaskStarted)
             }
    -        sparkContext.cancelStage(DataFrameRangeSuite.stageToKill)
    +        sparkContext.cancelStage(taskStart.stageId)
    +        DataFrameRangeSuite.semaphore.release(slices)
    --- End diff ---
    
    The concept is clear, but running it like this:
    ```
    sparkContext.range(0, 10000L, numSlices = slices).mapPartitions { x =>
      synchronized { wait() }
      x
    }.toDF("id").agg(sum("id")).collect()
    ```
    throws:
    ```
    Expected the cause to be SparkException, got java.io.NotSerializableException: org.scalatest.Assertions$AssertionsHelper
    Serialization stack:
        - object not serializable (class: org.scalatest.Assertions$AssertionsHelper, value: org.scalatest.Assertions$AssertionsHelper@aced190)
    ...
    ```
    This is the exception I was referring to.
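    A possible sketch to get around it (assuming the only goal here is to block the tasks until the stage is cancelled): synchronize on something that already lives inside the closure, e.g. the partition iterator, so the bare `synchronized { wait() }` does not resolve to the enclosing suite and drag its non-serializable `AssertionsHelper` into the task closure:
    ```
    sparkContext.range(0, 10000L, numSlices = slices).mapPartitions { x =>
      // Wait on the iterator itself; a bare wait() would be called on the
      // enclosing suite instance and force Spark to serialize it.
      x.synchronized { x.wait() }
      x
    }.toDF("id").agg(sum("id")).collect()
    ```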

