Github user skonto commented on the issue:

    https://github.com/apache/spark/pull/21930
  
    @felixcheung I tested with this simple program, compiled with Scala 2.11. The app jar was built against the officially released artifacts (2.3.1):
    ```
    import org.apache.spark.TaskContext
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession
      .builder
      .appName("Test compat")
      .getOrCreate()

    spark.sparkContext.makeRDD(1 to 10000).foreachPartition { x =>
      // Register a listener that fires when the task completes; it prints
      // the partition id and the number of elements in this partition.
      TaskContext.get().addTaskCompletionListener { y: TaskContext =>
        println(s"Finishing...${y.partitionId()}...${x.length}")
      }
    }
    ```
    I ran this app with the official 2.3.1 distro and also with a distro built from this PR.
    I got no errors; in both cases the output is:
    ```
    Finishing...3...1250
    Finishing...0...1250
    Finishing...4...1250
    Finishing...7...1250
    Finishing...5...1250
    Finishing...6...1250
    Finishing...1...1250
    Finishing...2...1250
    ```
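    For reference, the runs were plain spark-submit invocations; a minimal sketch (the class and jar names are assumptions for illustration, and `local[8]` matches the 8 partitions seen in the output above):
    ```
    # Hypothetical invocation; class and jar names are made up.
    # local[8] gives 8 default partitions, hence the 8 output lines above.
    $SPARK_HOME/bin/spark-submit \
      --class TestCompat \
      --master "local[8]" \
      test-compat_2.11-0.1.jar
    ```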
    By the definition of binary compatibility, when you change a class, client code that was compiled against the old version should continue to link and run against the new one without recompiling. I hope I am not missing something here.
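    For a concrete, made-up illustration of what *would* break binary compatibility (`Lib` and `compute` are hypothetical, not Spark code):
    ```
    // v1 of a hypothetical library; an application is compiled against this:
    object Lib {
      def compute(n: Int): Int = n * 2
    }
    // If v2 changed the signature, e.g. to
    //   def compute(n: Int): Long = n.toLong * 2
    // source code calling Lib.compute would still compile, but an app compiled
    // against v1 references compute(I)I in its bytecode and would fail at
    // runtime with a NoSuchMethodError when run against v2 without rebuilding.
    ```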


