GitHub user skonto opened a pull request:

    https://github.com/apache/spark/pull/21930

    [SPARK-14540][Core] Fix remaining major issues for Scala 2.12 Support 

    ## What changes were proposed in this pull request?
    This PR addresses issues 2 and 3 in the [document](https://docs.google.com/document/d/1fbkjEL878witxVQpOCbjlvOvadHtVjYXeB-2mgzDTvk).
    
    * We modified the closure cleaner to identify closures that are implemented via the LambdaMetaFactory mechanism (serializedLambdas) (issue 2); see the sketch below.
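
    As a minimal sketch of the detection idea (illustrative names, not the actual ClosureCleaner internals): lambdas emitted via LambdaMetaFactory are serialized through a synthetic `writeReplace` method that returns a `java.lang.invoke.SerializedLambda`, so a cleaner can probe for that method reflectively:

    ```scala
    import java.lang.invoke.SerializedLambda

    // Hedged sketch, not Spark's actual ClosureCleaner code: serializable
    // lambdas produced via LambdaMetaFactory carry a synthetic writeReplace
    // method returning java.lang.invoke.SerializedLambda, which distinguishes
    // them from classic anonymous-class closures.
    object LambdaClosureCheck {
      def asSerializedLambda(closure: AnyRef): Option[SerializedLambda] = {
        val cls = closure.getClass
        // indy-generated lambda classes are marked synthetic
        if (!cls.isSynthetic) return None
        try {
          val writeReplace = cls.getDeclaredMethod("writeReplace")
          writeReplace.setAccessible(true)
          writeReplace.invoke(closure) match {
            case sl: SerializedLambda => Some(sl)
            case _                    => None
          }
        } catch {
          case _: NoSuchMethodException => None // not a serializable lambda
        }
      }
    }
    ```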
    
    * We also fix scala/bug#11016. There are two options for solving the Unit issue: either add `()` at the end of the closure, or use the trick described in the doc. Otherwise overload resolution does not work here (we are not going to eliminate either of the methods). The compiler tries to adapt the closure to Unit, which makes both methods candidates for overload resolution; with a polymorphic overload there is no ambiguity, and that is the workaround implemented (see the sketch below). This does not look great, but it serves its purpose: we need to support two different uses of the method `addTaskCompletionListener`, one that passes a TaskCompletionListener and one that passes a closure that is wrapped in a TaskCompletionListener later on (issue 3).
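
    As a minimal sketch of the polymorphic-overload trick (simplified signatures, not the exact Spark API): making the closure-taking overload generic in its result type means the compiler no longer adapts the closure's result to Unit, so the two overloads stop competing:

    ```scala
    // Hedged sketch of the workaround; simplified from the real Spark API.
    trait TaskCompletionListener {
      def onTaskCompletion(context: TaskContext): Unit
    }

    trait TaskContext {
      // SAM-style overload, kept for existing callers.
      def addTaskCompletionListener(listener: TaskCompletionListener): TaskContext

      // Polymorphic overload: accepting TaskContext => U (rather than
      // TaskContext => Unit) stops the compiler from adapting a closure's
      // result to Unit and making both overloads applicable, which is the
      // ambiguity tracked in scala/bug#11016.
      def addTaskCompletionListener[U](f: TaskContext => U): TaskContext =
        addTaskCompletionListener(new TaskCompletionListener {
          override def onTaskCompletion(context: TaskContext): Unit = f(context)
        })
    }
    ```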
    
    Note: regarding issue 1 the plan is:
    
    > Do Nothing. Don’t try to fix this, as this is only a problem for Java users who would want to use 2.11 binaries. In that case they can cast to MapFunction to be able to utilize lambdas. In Spark 3.0.0 the API should be simplified so that this issue is removed.
    
    ## How was this patch tested?
    This was manually tested:
    ```
    ./dev/change-scala-version.sh 2.12
    ./build/mvn -DskipTests -Pscala-2.12 clean package
    ./build/mvn -Pscala-2.12 clean package -DwildcardSuites=org.apache.spark.serializer.ProactiveClosureSerializationSuite -Dtest=None
    ./build/mvn -Pscala-2.12 clean package -DwildcardSuites=org.apache.spark.util.ClosureCleanerSuite -Dtest=None
    ./build/mvn -Pscala-2.12 clean package -DwildcardSuites=org.apache.spark.streaming.DStreamClosureSuite -Dtest=None
    ```

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/skonto/spark scala2.12-sup

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21930.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21930
    
----
commit d466a9c2529e5fe1a680b6bab6b80fd57bef8c35
Author: Stavros Kontopoulos <stavros.kontopoulos@...>
Date:   2018-07-30T22:51:14Z

    initial

----

