Thanks Paul.  I'm not able to follow the discussion on SPARK-2075.
How would you recommend I test or follow up on it? Is there a
workaround?
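
One thing I could try locally (this is just my own sketch, not something
taken from the SPARK-2075 thread): print where the SparkContext class is
actually loaded from at runtime. If it resolves to an old 0.9 jar (for
example one bundled inside my assembly) rather than the 1.0.0 classpath
that spark-submit sets up, the NoSuchMethodError below would be a plain
compile-vs-runtime version mismatch.

import org.apache.spark.SparkContext

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Plain JVM reflection: report which jar the running SparkContext class came from.
    val location = classOf[SparkContext].getProtectionDomain.getCodeSource.getLocation
    println("SparkContext loaded from: " + location)
  }
}

Submitting that with the same spark-submit command quoted below should
show which jar wins.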

On 6/25/14, Paul Brown <p...@mult.ifario.us> wrote:
> Hi, Robert --
>
> I wonder if this is an instance of SPARK-2075:
> https://issues.apache.org/jira/browse/SPARK-2075
>
> -- Paul
>
> —
> p...@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/
>
>
> On Wed, Jun 25, 2014 at 6:28 AM, Robert James <srobertja...@gmail.com>
> wrote:
>
>> On 6/24/14, Robert James <srobertja...@gmail.com> wrote:
>> > My app works fine under Spark 0.9.  I just tried upgrading to Spark
>> > 1.0 by downloading the Spark distro to a dir, changing the sbt file,
>> > and running sbt assembly, but now I get NoSuchMethodErrors when trying
>> > to use spark-submit.
>> >
>> > I copied in the SimpleApp example from
>> > http://spark.apache.org/docs/latest/quick-start.html and get the same
>> > error:
>> >
>> > $ /usr/local/share/spark/bin/spark-submit --class SimpleApp target/scala-2.10/myproj-assembly-1.0.jar
>> > Spark assembly has been built with Hive, including Datanucleus jars on classpath
>> > Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
>> >       at SimpleApp$.main(SimpleApp.scala:10)
>> >       at SimpleApp.main(SimpleApp.scala)
>> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> >       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >       at java.lang.reflect.Method.invoke(Method.java:601)
>> >       at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
>> >       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
>> >       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> >
>> > How can I migrate to Spark 1.0.0?
>> >
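
For reference, this is the kind of build.sbt I would try for the 1.0.0
upgrade (a sketch on my part, assuming sbt 0.13 with sbt-assembly; the
"provided" scope is my guess at a fix, not something confirmed in this
thread). Keeping spark-core out of the assembly means spark-submit
supplies its own 1.0.0 classes at runtime, so a stale 0.9 copy in the
fat jar cannot shadow them:

name := "myproj"

version := "1.0"

scalaVersion := "2.10.4"

// Compile against Spark 1.0.0 but do not bundle it into the assembly jar;
// spark-submit already puts the Spark 1.0.0 assembly on the runtime classpath.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"

After changing that, a fresh sbt clean assembly and a re-run of the
spark-submit command above would show whether the mismatch goes away.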
>>
>> I've done `sbt clean`, deleted the entire ivy2 cache, and still get
>> the above error on both my code and the official Spark example.  Can
>> anyone guide me on how to debug this?
>>
>> How does Spark find the /usr/local/share/spark directory? Is there a
>> variable somewhere that I need to set to point to it, or that might still
>> point to the old Spark? I've left the old Spark dir on the machine (I just
>> changed the symlink) - can that be causing problems?
>>
>> How should I approach this?
>>
>
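
One concrete thing I can check on my side (again just a sketch of an
approach, not advice from the thread): open the assembly jar and see
whether it bundles Spark classes at all. If org/apache/spark/** is inside
target/scala-2.10/myproj-assembly-1.0.jar, those possibly 0.9-era classes
would take precedence over whatever the /usr/local/share/spark symlink
points at, which could explain the NoSuchMethodError regardless of which
distro directory spark-submit resolves.

import java.util.jar.JarFile
import scala.collection.JavaConverters._

object InspectAssembly {
  def main(args: Array[String]): Unit = {
    // Count the Spark class files bundled into the assembly jar (default path is the
    // one from the spark-submit command above; pass a different one as the first arg).
    val jarPath = args.headOption.getOrElse("target/scala-2.10/myproj-assembly-1.0.jar")
    val jar = new JarFile(jarPath)
    try {
      val sparkEntries = jar.entries.asScala.map(_.getName).filter(_.startsWith("org/apache/spark/"))
      println("Spark classes bundled in " + jarPath + ": " + sparkEntries.size)
    } finally {
      jar.close()
    }
  }
}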
