Hi,
One thing you can do is set the Spark version your project depends on
to "1.0.0-SNAPSHOT" (make sure it matches the version of Spark you're
building); then before building your project, run "sbt publishLocal"
on the Spark tree.
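For example, the dependency line in your application's build.sbt would look
something like this (the project details here are placeholders; the Scala
version should match whatever your project builds with):

    // application build.sbt (sketch)
    scalaVersion := "2.10.4"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0-SNAPSHOT"

Running "sbt publishLocal" in the Spark tree publishes the 1.0.0-SNAPSHOT
artifacts to your local ivy repository, which sbt resolves against by default.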
On Wed, Apr 30, 2014 at 12:11 AM, wxhsdp wrote:
> I fixed it.
>
On 30 Apr 2014 06:59, "Patrick Wendell" wrote:
>
> The signature of this function was changed in Spark 1.0... is there
> any chance that somehow you are actually running against a newer
> version of Spark?
>
> On Tue, Apr 29, 2014 at 8:58 PM, wxhsdp wrote:
> > I met the same problem when updating to Spark 0.9.1
I fixed it. I made my sbt project depend on
spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
and it works.
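In sbt terms that is an unmanaged jar; a minimal sketch of the build.sbt line
(the path is from my build, adjust it to wherever your Spark tree lives):

    // depend directly on the locally built assembly jar
    unmanagedJars in Compile += Attributed.blank(
      file("spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar"))

(Dropping the jar into the project's lib/ directory should work too, since sbt
picks up unmanaged jars from there automatically.)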
Hi Patrick,
I checked out https://github.com/apache/spark/ this morning and built
spark/trunk with ./sbt/sbt assembly.
Is that Spark 1.0?
If so, how can I update my sbt file? The latest version in
http://repo1.maven.org/maven2/org/apache/spark/
is 0.9.1.
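For reference, my sbt file currently pulls the published artifact, i.e.
something like:

    // build.sbt today: 0.9.1 is the newest version on Maven Central
    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"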
Thank you for your help.
The signature of this function was changed in Spark 1.0... is there
any chance that somehow you are actually running against a newer
version of Spark?
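If I remember right, the return type of SparkContext.jarOfClass changed, which
is exactly the kind of difference that shows up as a NoSuchMethodError at run
time (a sketch of the two signatures, not the exact source):

    // Spark 0.9.x
    def jarOfClass(cls: Class[_]): Seq[String]

    // Spark 1.0
    def jarOfClass(cls: Class[_]): Option[String]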
On Tue, Apr 29, 2014 at 8:58 PM, wxhsdp wrote:
> I met the same problem when updating to Spark 0.9.1
> (svn checkout https://github.com/apache/spark/)
I met the same problem when updating to Spark 0.9.1
(svn checkout https://github.com/apache/spark/):
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.SparkContext$.jarOfClass(Ljava/lang/Class;)Lscala/collection/Seq;
        at org.apache.spark.examples.GroupByTest$.main(
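(For what it's worth, the descriptor in that error reads as "a method taking a
java.lang.Class and returning a scala.collection.Seq", i.e. the example was
compiled against the 0.9.x-style API, roughly:

    // paraphrase of the 0.9.x example code (inside object GroupByTest),
    // not the exact file
    val sc = new SparkContext(args(0), "GroupBy Test",
      System.getenv("SPARK_HOME"),
      SparkContext.jarOfClass(this.getClass))  // 0.9.x: returns Seq[String]

so a 1.0 runtime, where the return type differs, can't link it.)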