I am seeing the following exception from a very basic test project when it
runs on Spark in local mode.

java.lang.NoSuchMethodError:
org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;

The project is built with Java 1.6, Scala 2.10.3, and Spark 0.9.1.
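
For reference, the Spark dependency for these versions would be declared
roughly like this in a Maven build (assuming the standard spark-core_2.10
coordinates; adjust if your build is set up differently):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>0.9.1</version>
    </dependency>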

The error occurs on the mapped.reduce() call below.

The code is quite simple:

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

JavaSparkContext sc = new JavaSparkContext("local[4]", "My Test App");

List<String> rd = buildData();
JavaRDD<String> data = sc.parallelize(rd);

// In the 0.9 Java API, map() with a PairFunction yields a JavaPairRDD.
JavaPairRDD<String, List<String>> mapped =
    data.map(new PairFunction<String, String, List<String>>() {
        @Override
        public Tuple2<String, List<String>> call(String value) throws Exception {
            // randomly assign between 1 and 4 match values
            // (stand-in for the code elided in the original post)
            List<String> matches = new ArrayList<String>();
            int n = 1 + new Random().nextInt(4);
            for (int i = 0; i < n; i++) {
                matches.add(value + "-" + i);
            }
            return new Tuple2<String, List<String>>(value, matches);
        }
    });

// This is the call that throws the NoSuchMethodError.
mapped.reduce(new Function2<Tuple2<String, List<String>>,
                            Tuple2<String, List<String>>,
                            Tuple2<String, List<String>>>() {
    @Override
    public Tuple2<String, List<String>> call(Tuple2<String, List<String>> t1,
                                             Tuple2<String, List<String>> t2)
            throws Exception {
        System.err.println("REDUCING " + t1 + " with " + t2);
        return t1;
    }
});
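
buildData() isn't shown above; it just returns a List<String> of test data.
For a self-contained repro, a trivial stand-in like the following should do
(a hypothetical helper; the exact contents shouldn't matter):

    // Hypothetical stand-in for the buildData() helper referenced above;
    // any small list of strings exercises the same code path.
    private static List<String> buildData() {
        return java.util.Arrays.asList("a", "b", "c", "d", "e");
    }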



-- 
Jared Rodriguez
