[
https://issues.apache.org/jira/browse/SPARK-3266?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Josh Rosen updated SPARK-3266:
------------------------------
Assignee: (was: Josh Rosen)
I'm unassigning myself from this since I'm no longer actively working on it
(although I really want to fix this as soon as I have time).
Copying a [comment from one of my pull
requests|https://github.com/apache/spark/pull/2951#issuecomment-63769092]:
{quote}
@viper-kun No, I'm not actively working on this. A pull request here would be
very welcome, since this is an annoying bug. If you're planning to work on
this, make sure to include the extra test cases that I added to JavaAPISuite;
these tests should be useful regardless of what approach you take to fixing
these bugs.
From a binary compatibility standpoint, it's important to keep the Java*Like
interfaces since there's some code in the wild that uses these interfaces to
abstract over the different implementations. Removing default implementations
from traits technically breaks compatibility for anyone who might have
extended those traits, but I don't think that should be a huge concern /
likely problem.
If you want to avoid copying / moving everything around, then I think it would
be sufficient to just identify the methods that are affected by this compiler
bug and copy only those methods to each subclass. We could maintain 100% binary
compatibility with this approach, even for the obscure case where someone
extended a trait, and it might make it easier to backport the fix to
maintenance branches, but it also seems risk-prone, since someone might later
add new default implementations to the trait without copying them down.
For the sake of keeping the discussion in one place, let's chat about
alternative designs on the JIRA ticket. I'll unassign myself from it and copy
this comment over there.
{quote}
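The workaround described in the quoted comment, copying the affected default methods into each concrete subclass so that the compiled class file declares them directly, can be sketched in plain Java terms. The names below (RDDLike, DoubleRDD) are hypothetical stand-ins, not the real Spark API; this is only an illustration of the pattern, under the assumption that a concrete override behaves like the copied trait method would.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Hypothetical analogue of a Scala trait with a default implementation.
interface RDDLike<T> {
    List<T> collect();

    // Default method -- the rough equivalent of a Scala trait method body.
    default T max(Comparator<T> comp) {
        return collect().stream().max(comp).get();
    }
}

// Hypothetical analogue of a concrete Java API class like JavaDoubleRDD.
class DoubleRDD implements RDDLike<Double> {
    private final List<Double> data;

    DoubleRDD(List<Double> data) {
        this.data = data;
    }

    @Override
    public List<Double> collect() {
        return data;
    }

    // The "copied" method: a concrete override means this class file itself
    // declares max(Comparator) instead of only inheriting it, which is the
    // effect the proposed fix aims for on the methods hit by the compiler bug.
    @Override
    public Double max(Comparator<Double> comp) {
        return collect().stream().max(comp).get();
    }
}
```

Because the override delegates to the same logic, behavior is unchanged; the only difference is where the method is physically declared, which is what binary compatibility cares about.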
> JavaDoubleRDD doesn't contain max()
> -----------------------------------
>
> Key: SPARK-3266
> URL: https://issues.apache.org/jira/browse/SPARK-3266
> Project: Spark
> Issue Type: Bug
> Components: Java API
> Affects Versions: 1.0.1, 1.0.2, 1.1.0, 1.2.0
> Reporter: Amey Chaugule
> Attachments: spark-repro-3266.tar.gz
>
>
> While my code compiles, at runtime I see:
> Caused by: java.lang.NoSuchMethodError:
> org.apache.spark.api.java.JavaDoubleRDD.max(Ljava/util/Comparator;)Ljava/lang/Double;
> when I try to execute my Spark code. Stepping into the JavaDoubleRDD class, I
> don't see max(), although it is clearly listed in the documentation.
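A reflection check of the kind below can help diagnose reports like this one: it distinguishes a method that the class file itself declares from one that is only inherited. The classes here (Ops, Rdd, Probe) are hypothetical stand-ins rather than the real Spark classes, sketched only to show the declared-vs-inherited distinction that underlies the NoSuchMethodError.

```java
import java.util.Comparator;

// Hypothetical stand-ins (not the real Spark API): the method body lives
// only on the interface, so the implementing class inherits it without
// declaring it in its own class file.
interface Ops {
    default double max(Comparator<Double> comp) {
        return 0.0;
    }
}

class Rdd implements Ops {
}

class Probe {
    // Returns true only if clazz itself declares max(Comparator);
    // getDeclaredMethod deliberately ignores inherited members.
    static boolean declaresMax(Class<?> clazz) {
        try {
            clazz.getDeclaredMethod("max", Comparator.class);
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }
}
```

Running Probe.declaresMax against the real JavaDoubleRDD class from the affected Spark versions would show whether max(Comparator) is actually present in the compiled class, independent of what the generated documentation claims.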
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)