Github user tdas commented on the issue:

    https://github.com/apache/spark/pull/13996
  
    Let me note down all the concerns clearly.
    
    1. Compatibility with abstract class vs trait: Let me clarify. With an abstract class, you can later add concrete methods, which users can override if needed, without breaking compatibility of existing implementations. For example, if there is a class `abstract class A { public abstract void func(); }`, you can change it to `abstract class A { public abstract void func(); public void newFunc() { } }`. Existing implementations of A will not break at compile time, and if the default implementation of `newFunc()` is written correctly, the earlier behavior is maintained at runtime. No runtime failures. This is not possible with a Scala `trait`: the moment you add a concrete method to a trait, it compiles down to a Java interface AND a similarly named Java class, which becomes painful for Java users, with no compatibility whatsoever.
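    To make the compatibility argument concrete, here is a minimal Java sketch (class names are hypothetical, not from the actual PR). `Impl` represents an existing implementation compiled against the original version of `A` that only declared `func()`; after `newFunc()` is added with a default body, `Impl` still compiles and behaves as before:

    ```java
    public class Main {
        // Hypothetical API base class: func() existed from the start;
        // newFunc() was added in a later release with a default body.
        static abstract class A {
            public abstract String func();
            // Added later; the default body delegates so old behavior is preserved.
            public String newFunc() { return func(); }
        }

        // An "old" implementation written against the original A.
        // It compiles untouched after newFunc() is added to the base class.
        static class Impl extends A {
            @Override public String func() { return "impl"; }
        }

        public static void main(String[] args) {
            A a = new Impl();
            System.out.println(a.func());     // prints "impl"
            System.out.println(a.newFunc());  // prints "impl" via the default body
        }
    }
    ```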
    
    2. Language (Scala vs Java): I did the refactoring in Java because I knew it would obviously work in Java, as there is no Scala magic left anywhere. But I just did a quick test with Scala:
    ```
    abstract class A {
      def func(): Unit
      // class B below compiles fine with and without this concrete method
      def newFunc(): Unit = { }
    }
    
    object A {
      def staticFunc() { }
    }
    
    class B(val x: String) extends A {
      override def func() { }
    }
    ```
    The resulting public interface is as expected:
    ```
    $ javap A.class
    Compiled from "MapConverter.scala"
    public abstract class org.apache.spark.streaming.kafka010.A {
      public static void staticFunc();
      public abstract void func();
      public void newFunc();
      public org.apache.spark.streaming.kafka010.A();
    }
    ```
    So this should be fine if written in Scala.
    
    3. Style: This is a fuzzier point. I agree that this bends the style guide a little, but we are doing so in specific cases to ensure that the public API looks a certain way. Case in point: `OutputMode` in `sql.streaming`. This unusual style makes the API look enum-ish, but allows parameterized enum-ish values.
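    As a rough illustration of that "enum-ish but parameterized" pattern (names are purely illustrative, not Spark's actual `OutputMode` source), an abstract class with a private constructor and static factories can expose singleton values like an enum, while also allowing values that carry parameters, which a real `enum` cannot do per-call:

    ```java
    public class Main {
        static abstract class Mode {
            private Mode() {}  // only the nested subclasses below can extend

            // Enum-like singleton value exposed via a static factory...
            public static Mode append() { return Append.INSTANCE; }

            // ...but unlike a real enum, a value can be parameterized.
            public static Mode every(long intervalMs) { return new Every(intervalMs); }

            static final class Append extends Mode {
                static final Append INSTANCE = new Append();
            }

            static final class Every extends Mode {
                final long intervalMs;
                Every(long intervalMs) { this.intervalMs = intervalMs; }
            }
        }

        public static void main(String[] args) {
            Mode m1 = Mode.append();
            Mode m2 = Mode.every(1000L);
            System.out.println(m1 == Mode.append());          // singleton, prints "true"
            System.out.println(((Mode.Every) m2).intervalMs); // prints "1000"
        }
    }
    ```

    From Java, the call sites read like enum constants (`Mode.append()`), which is the API shape the style deviation is buying.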
    
    Your PR #13998 is going in the right direction. I am stuck in meetings for the next couple of hours, so if you don't mind, could you update your PR?
    
    In addition, for testing, I noticed that JavaConsumerStrategySuite used `ju.HashMap[TopicPartition, Object]` for offsets, not `ju.HashMap[TopicPartition, java.lang.Long]`. It is better to test with the latter, as that is what we would expect the user to pass as a parameter. You can pull in as much or as little of my changes in this PR as you want.
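    The reason `java.lang.Long` is the realistic value type: a Java caller building an offsets map gets boxed `Long` values from autoboxing, not arbitrary `Object`s. A small sketch (using `String` keys as a stand-in for Kafka's `TopicPartition`, so it runs without the Kafka dependency):

    ```java
    import java.util.HashMap;
    import java.util.Map;

    public class Main {
        public static void main(String[] args) {
            // What a Java user would naturally write: the literal 23L is
            // autoboxed to java.lang.Long when stored in the map.
            Map<String, Long> offsets = new HashMap<>();
            offsets.put("topic-0", 23L);

            System.out.println(offsets.get("topic-0").getClass().getName()); // java.lang.Long
            System.out.println(offsets.get("topic-0"));                      // 23
        }
    }
    ```

    Testing with `Object` values would never exercise that boxing path, which is why the suite should use `java.lang.Long`.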
    


