[ https://issues.apache.org/jira/browse/SPARK-1834?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14156477#comment-14156477 ]

Sean Owen commented on SPARK-1834:
----------------------------------

Weird, I can reproduce this. I have a new test case for {{JavaAPISuite}} and am 
investigating. It compiles fine but fails at runtime. I sense Scala shenanigans.

> NoSuchMethodError when invoking JavaPairRDD.reduce() in Java
> ------------------------------------------------------------
>
>                 Key: SPARK-1834
>                 URL: https://issues.apache.org/jira/browse/SPARK-1834
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.1
>         Environment: Redhat Linux, Java 7, Hadoop 2.2, Scala 2.10.4
>            Reporter: John Snodgrass
>
> I get a java.lang.NoSuchMethodError when invoking JavaPairRDD.reduce(). Here 
> is the partial stack trace:
> Exception in thread "main" java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:601)
>         at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:39)
>         at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
> Caused by: java.lang.NoSuchMethodError: org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;
>         at JavaPairRDDReduceTest.main(JavaPairRDDReduceTest.java:49)    ...
> I'm using Spark 0.9.1. I checked to ensure that I'm compiling with the same 
> version of Spark as I am running on the cluster. The reduce() method works 
> fine with JavaRDD, just not with JavaPairRDD. Here is a code snippet that 
> exhibits the problem: 
>       ArrayList<Integer> array = new ArrayList<>();
>       for (int i = 0; i < 10; ++i) {
>         array.add(i);
>       }
>       JavaRDD<Integer> rdd = javaSparkContext.parallelize(array);
>       JavaPairRDD<String, Integer> testRDD = rdd.map(new PairFunction<Integer, String, Integer>() {
>         @Override
>         public Tuple2<String, Integer> call(Integer t) throws Exception {
>           return new Tuple2<>("" + t, t);
>         }
>       }).cache();
>       
>       testRDD.reduce(new Function2<Tuple2<String, Integer>, Tuple2<String, Integer>, Tuple2<String, Integer>>() {
>         @Override
>         public Tuple2<String, Integer> call(Tuple2<String, Integer> arg0, Tuple2<String, Integer> arg1) throws Exception { 
>           // Concatenate keys; accumulate values base-10 (arg1._2 here,
>           // where the original snippet likely had a typo, arg0._2).
>           return new Tuple2<>(arg0._1 + arg1._1, arg0._2 * 10 + arg1._2);
>         }
>       });
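The combine function in the snippet can be sanity-checked without a Spark cluster by replaying the same left fold over plain Java pairs. This is only a sketch: the class and method names (`ReduceSemantics`, `combineAll`) are made up for illustration, and it assumes the reducer intended `arg1._2` for the second value.

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class ReduceSemantics {
  // Left-fold the pairs with the same combine function the reporter
  // passes to reduce(): concatenate the keys, accumulate the values
  // base-10 (assuming arg1._2, not arg0._2, was intended).
  static Map.Entry<String, Integer> combineAll(List<Map.Entry<String, Integer>> pairs) {
    Map.Entry<String, Integer> acc = pairs.get(0);
    for (int i = 1; i < pairs.size(); ++i) {
      Map.Entry<String, Integer> next = pairs.get(i);
      acc = new SimpleEntry<>(acc.getKey() + next.getKey(),
                              acc.getValue() * 10 + next.getValue());
    }
    return acc;
  }

  public static void main(String[] args) {
    // Build the same ("0", 0) ... ("9", 9) pairs as the snippet above.
    List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
    for (int i = 0; i < 10; ++i) {
      pairs.add(new SimpleEntry<>("" + i, i));
    }
    Map.Entry<String, Integer> result = combineAll(pairs);
    System.out.println(result.getKey() + " -> " + result.getValue());
    // prints: 0123456789 -> 123456789
  }
}
```

Note this only checks the reduction logic; it says nothing about the NoSuchMethodError itself, which occurs before the function ever runs.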



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org