Github user skonto commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21930#discussion_r206562302
  
    --- Diff: core/src/main/scala/org/apache/spark/TaskContext.scala ---
    @@ -123,7 +123,7 @@ abstract class TaskContext extends Serializable {
        *
        * Exceptions thrown by the listener will result in failure of the task.
        */
    -  def addTaskCompletionListener(f: (TaskContext) => Unit): TaskContext = {
    +  def addTaskCompletionListener[U](f: (TaskContext) => U): TaskContext = {
    --- End diff ---
    
    Yes, this covers that bug. If you build with 2.11 you don't need to 
specify the type Unit when you make the call (I tried that), since there is no 
ambiguity and the compiler does not face an overloading issue. With 2.12, both 
addTaskCompletionListener methods end up being SAM types, and the Unit adaptation 
causes this issue. I am not sure we can do anything more here. @retronym or 
@lrytz may add more context.
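
    For anyone following along, here is a minimal, self-contained sketch of 
the situation (hypothetical names like `Listener`/`addListener`, not the actual 
Spark API): one overload takes a plain function type, the other a SAM interface, 
and the `[U]` change from this diff is what lets a non-Unit lambda match the 
function overload exactly, without Unit adaptation.

    ```scala
    // Stand-in SAM type, playing the role of TaskCompletionListener.
    trait Listener {
      def onComplete(ctx: String): Unit
    }

    object Before {
      def addListener(f: String => Unit): Unit = f("ctx")        // function overload
      def addListener(l: Listener): Unit = l.onComplete("ctx")   // SAM overload

      // On 2.11 only the function overload is considered (lambdas are not
      // SAM-converted by default). On 2.12 the lambda is eligible for both
      // overloads, and the non-Unit body reportedly makes resolution fail:
      // addListener(ctx => ctx.length)
    }

    object After {
      // The diff's approach, applied to the sketch: a generic result type U
      // lets a lambda returning Int match (String => Int) exactly, with no
      // Unit adaptation, so 2.12 can pick the function overload.
      def addListener[U](f: String => U): Unit = { f("ctx"); () }
      def addListener(l: Listener): Unit = l.onComplete("ctx")

      addListener(ctx => ctx.length)   // compiles under both 2.11 and 2.12
    }
    ```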

