[ https://issues.apache.org/jira/browse/SPARK-3266?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14114387#comment-14114387 ]
Colin B. commented on SPARK-3266:
---------------------------------

So there is no method:
{code}
org.apache.spark.api.java.JavaDoubleRDD.max(Ljava/util/Comparator;)Ljava/lang/Double;
{code}
but there is a method:
{code}
org.apache.spark.api.java.JavaDoubleRDD.max(Ljava/util/Comparator;)Ljava/lang/Object;
{code}
The return type is part of a method's signature in Java bytecode, so the two are different: one returns a Double, the other an Object. This looks like a Scala type-erasure issue. The Scala code generated for JavaRDDLike includes a max method that returns Object. In JavaDoubleRDD the type parameter is bound to Double, so Java code that calls max on a JavaDoubleRDD expects a method returning Double. Because max is implemented in the JavaRDDLike trait, the Java-visible class does not inherit it with the specialized return type when type parameters are involved.

I tested making JavaRDDLike an abstract class instead of a trait. It compiled and ran correctly; however, it is not binary-compatible with 1.0.2.

> JavaDoubleRDD doesn't contain max()
> -----------------------------------
>
>                 Key: SPARK-3266
>                 URL: https://issues.apache.org/jira/browse/SPARK-3266
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 1.0.1, 1.0.2
>            Reporter: Amey Chaugule
>            Assignee: Josh Rosen
>        Attachments: spark-repro-3266.tar.gz
>
> While I can compile my code, I see:
> Caused by: java.lang.NoSuchMethodError:
> org.apache.spark.api.java.JavaDoubleRDD.max(Ljava/util/Comparator;)Ljava/lang/Double;
> when I try to execute my Spark code. Stepping into the JavaDoubleRDD class, I
> don't notice max(), although it is clearly listed in the documentation.

--
This message was sent by Atlassian JIRA
(v6.2#6252)
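For illustration, the bridge-method behavior described above can be sketched in plain Java. The types below (RDDLike, DoubleRDD) are hypothetical stand-ins for JavaRDDLike and JavaDoubleRDD, not the actual Spark classes. When javac compiles a class that implements a generic interface with a concrete type argument, it emits two max methods: the declared one returning Double, plus a synthetic bridge returning Object for the erased interface signature. The Spark bug is, in effect, the Scala compiler failing to emit the equivalent of that bridge/specialized pair for a trait-provided method:

```java
import java.lang.reflect.Method;
import java.util.Comparator;

// Hypothetical stand-in for JavaRDDLike: a generic interface whose
// erased signature is Object max(Comparator).
interface RDDLike<T> {
    T max(Comparator<T> comp);
}

// Hypothetical stand-in for JavaDoubleRDD: binds T to Double, so callers
// compiled against it link to a max that returns Double.
class DoubleRDD implements RDDLike<Double> {
    @Override
    public Double max(Comparator<Double> comp) {
        return 42.0; // dummy value; a real RDD would scan its partitions
    }

    public static void main(String[] args) {
        // Reflection shows BOTH methods exist in the compiled class:
        // the specialized one returning Double, and the synthetic
        // bridge returning Object.
        for (Method m : DoubleRDD.class.getDeclaredMethods()) {
            if (m.getName().equals("max")) {
                System.out.println(m.getReturnType().getName()
                        + (m.isBridge() ? " (bridge)" : ""));
            }
        }
    }
}
```

Because the JVM resolves methods by the full descriptor including the return type, a caller compiled against the Double-returning signature throws NoSuchMethodError if the class file only contains the Object-returning variant, which matches the stack trace in the issue.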