hsdp wrote:
> I fixed it.
>
> I made my sbt project depend on
> spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
> and it works.
>
> > scalaVersion := "2.10.4"
> >
> > libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
> >
> > resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
> >
> > is there something I need to modify?
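
For anyone hitting the same thing: one way to wire a locally built assembly jar into an sbt build is sbt's unmanaged-dependency mechanism. This is only a minimal sketch under that assumption (the project name and layout are placeholders, and the jar path is the one from the message above), not necessarily how hsdp actually set it up:

// build.sbt -- minimal sketch; "spark-test" is a placeholder name, and the
// jar path is the one quoted in the message above.
//
// sbt treats every jar under the project's lib/ directory as an unmanaged
// dependency, so copying the assembly there puts it on the compile and run
// classpaths without any extra settings:
//
//   mkdir -p lib
//   cp spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar lib/
//
// With the assembly in lib/, drop the managed spark-core line from the build,
// otherwise two different Spark versions end up on the classpath.
name := "spark-test"

scalaVersion := "2.10.4"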
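
For comparison, staying on the published artifacts also works, as long as the Spark version the project compiles against is the same one that is on the classpath when the job runs. A sketch of the build configuration quoted above with that in mind (the project name is a placeholder):

// build.sbt -- sketch of a build against the published Spark 0.9.1 artifact.
name := "spark-test"

// Any Scala 2.10.x release is binary compatible with spark-core_2.10.
scalaVersion := "2.10.4"

// %% appends the Scala binary version, so this resolves spark-core_2.10:0.9.1.
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"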
I am seeing the following exception from a very basic test project when it
runs on Spark in local mode:

java.lang.NoSuchMethodError:
org.apache.spark.api.java.JavaPairRDD.reduce(Lorg/apache/spark/api/java/function/Function2;)Lscala/Tuple2;

The project is built with Java 1.6, Scala 2.10.3 and Spark 0.9.1.
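
A java.lang.NoSuchMethodError like this generally means the Spark classes on the runtime classpath come from a different build than the one the project was compiled against, which is consistent with the fix reported at the top of the thread (compiling against the assembly jar that is actually used at run time). Below is a rough Scala sketch, not the poster's code, of the kind of minimal job that goes through the JavaPairRDD.reduce call named in the stack trace; the object name and example data are made up:

// MinimalReduceTest.scala -- hypothetical reconstruction of a basic job that
// calls JavaPairRDD.reduce via Spark's Java-friendly API.
import org.apache.spark.api.java.JavaSparkContext
import org.apache.spark.api.java.function.Function2

object MinimalReduceTest {
  def main(args: Array[String]) {
    // Local mode, as in the original report.
    val sc = new JavaSparkContext("local", "reduce-test")

    // Made-up example data; parallelizePairs gives back a JavaPairRDD.
    val pairs = sc.parallelizePairs(
      java.util.Arrays.asList(("a", 1), ("b", 2), ("c", 3)))

    // This is the call whose signature appears in the error:
    // reduce(Function2) returning a scala.Tuple2. If the Spark jars on the
    // runtime classpath come from a different release than the one used at
    // compile time, it can fail here with a NoSuchMethodError even though
    // compilation was clean.
    val total = pairs.reduce(
      new Function2[(String, Int), (String, Int), (String, Int)] {
        def call(x: (String, Int), y: (String, Int)) = ("sum", x._2 + y._2)
      })

    println(total)
    sc.stop()
  }
}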