GitHub user koeninger commented on the issue:
https://github.com/apache/spark/pull/13996
I'll leave line comments about specific things, but my overarching concern
is the move of the interface to a Java abstract class.
As far as I can tell, nothing done in that class couldn't be done in
Scala. If this is just a reaction against the autogenerated apply methods,
nothing stops us from making a standalone, non-companion object with the
exact same interface you're proposing.
More important, in my opinion, is the reasoning behind moving to an
abstract class. Nothing about the definition of that interface requires
constructor state (the main capability a class adds over a trait), so a
class doesn't make sense.
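To make that concrete, here's a minimal sketch of what I mean; the names
(ConsumerStrategy, ConsumerStrategies, subscribe) are placeholders, not the
code in this PR. A plain trait carries no constructor state, and a
standalone non-companion object supplies the same factory surface that
autogenerated apply methods would:

```scala
// Hypothetical interface as a plain Scala trait: no constructor state needed.
trait ConsumerStrategy {
  // Stand-in method and return type, purely for illustration.
  def describe(): String
}

// Standalone, non-companion object: provides the factory methods an
// autogenerated apply would, without forcing the interface to be a class.
object ConsumerStrategies {
  def subscribe(topics: Seq[String]): ConsumerStrategy =
    new ConsumerStrategy {
      def describe(): String = s"subscribe(${topics.mkString(", ")})"
    }
}
```

Callers get the same ergonomics either way, e.g.
ConsumerStrategies.subscribe(Seq("topicA")).describe().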
If your thinking here is that you want to be able to later add methods with
a default implementation without relying on Java 8 interface default
methods... I'm really opposed to that line of reasoning. It prioritizes
binary compatibility at the cost of silently breaking people's code when
APIs really do need to change.
If the interface needs to change, and a bunch of people have already
implemented it in their own code, which would you rather have as a user?
Know at compile time that this interface (which is marked as experimental)
has changed, figure out how it changed, and write the right thing for your
use case? Or silently have it "work"... until you find out at runtime (or
in a business meeting about why metrics are now wrong) that the default
implementation the Spark project added does totally the wrong thing for
your use case?
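To spell out the failure mode I'm describing (all names below are
hypothetical, not anything in this PR):

```scala
// v1 of a hypothetical interface, as an abstract class.
abstract class MetricsStrategy {
  def process(records: Seq[Int]): Long
}

// v2: the project later adds a method with a default body, so code
// written against v1 keeps compiling and linking unchanged.
abstract class MetricsStrategyV2 {
  def process(records: Seq[Int]): Long
  // New method; the default implementation counts raw records.
  def recordCount(records: Seq[Int]): Long = records.size.toLong
}

object SilentBreakage extends App {
  // A user whose strategy exists to deduplicate records before counting.
  // Against v2 it still compiles untouched, and silently inherits a
  // recordCount that counts the duplicates this strategy is meant to drop.
  val dedup = new MetricsStrategyV2 {
    def process(records: Seq[Int]): Long = records.distinct.size.toLong
  }
  val batch = Seq(1, 1, 2, 3)
  println(dedup.process(batch))      // 3: deduplicated, what the user wants
  println(dedup.recordCount(batch))  // 4: inherited default, wrong for this use case
}
```

Had the method been added abstract (e.g. on a trait), the user would
instead get a compile error pointing straight at the change.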