Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18645#discussion_r128891907
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala ---
    @@ -353,7 +353,7 @@ class DatasetSuite extends QueryTest with SharedSQLContext {
       test("foreachPartition") {
         val ds = Seq(("a", 1), ("b", 2), ("c", 3)).toDS()
         val acc = sparkContext.longAccumulator
    -    ds.foreachPartition(_.foreach(v => acc.add(v._2)))
    +    ds.foreachPartition((it: Iterator[(String, Int)]) => it.foreach(v => acc.add(v._2)))
    --- End diff --
    
    No, this is just clarifying to the compiler which of the two overloads to invoke; both actually match in Scala 2.12, which makes the call an error there.
    
    It's going to be a problem for end-user apps in exactly the same way, yes. They would need similar changes to work with Spark in Scala 2.12 (and, probably, so would other code with similar characteristics in 2.12).
    
    This is really https://issues.apache.org/jira/browse/SPARK-14643. It doesn't necessarily block making these updates so that a 2.12 build is possible, but there's still the question of whether it's time to release a 2.12 build if it will require these changes. Then again, different Scala versions have always needed different builds.
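    The ambiguity can be reproduced outside Spark. Below is a minimal sketch of the situation: two overloads, one taking a Scala function and one taking a Java-style SAM interface, as `Dataset.foreachPartition` does. The trait and object names here are illustrative stand-ins, not Spark's actual API. In 2.12, lambdas SAM-convert, so a bare `_.foreach(...)` lambda can match both overloads; annotating the iterator's type resolves the call to the Scala-function overload.

    ```scala
    // Stand-in for Spark's Java-friendly SAM interface of the same name.
    trait ForeachPartitionFunction[T] { def call(it: Iterator[T]): Unit }

    object OverloadDemo {
      // Two overloads analogous to Dataset.foreachPartition.
      def foreachPartition[T](data: Seq[T], f: Iterator[T] => Unit): Unit =
        f(data.iterator)
      def foreachPartition[T](data: Seq[T], f: ForeachPartitionFunction[T]): Unit =
        f.call(data.iterator)

      def sumSecond(data: Seq[(String, Int)]): Int = {
        var sum = 0
        // In 2.12, `foreachPartition(data, _.foreach(...))` would be ambiguous
        // because the lambda can SAM-convert to ForeachPartitionFunction too.
        // The explicit parameter type selects the Scala-function overload.
        foreachPartition(data, (it: Iterator[(String, Int)]) => it.foreach(v => sum += v._2))
        sum
      }
    }
    ```

    This mirrors the diff above: the fix is purely a disambiguation at the call site, with no behavior change.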

