Hi Matei,

Good to hear from you. The stack trace is below. I launched the instances
with --spark-version=0.8.0 and verified that the version was correct by
launching spark-shell. I also verified that the version in my project is
0.8.0. Nothing else should have changed; the scripts I use to set up the
classpath and everything else are exactly the same as the ones I used with
0.7.3.

Cheers,


java.lang.Exception: Could not find resource path for Web UI: org/apache/spark/ui/static
    at org.apache.spark.ui.JettyUtils$.createStaticHandler(JettyUtils.scala:89)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:40)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:122)
    at walrusthecat.ml.spark.SparkSVM$.main(SparkSVM.scala:16)
    at walrusthecat.ml.spark.SparkSVM.main(SparkSVM.scala)
    at walrusthecat.ml.spark.Main.main(Main.java:7)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:78)
    at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:24)
    at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:88)
    at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:78)
    at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
    at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:33)
    at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:40)
    at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:60)
    at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:80)
    at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
    at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
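The exception comes from JettyUtils.createStaticHandler failing to locate the
org/apache/spark/ui/static resource directory through the class loader, and the
trace shows the app is launched through Eclipse's jar-in-jar JarRsrcLoader, so a
quick thing to check is whether that resource is visible to the application's
class loader at all. A minimal diagnostic sketch (the class name ResourceCheck
is mine, not part of the project):

```java
// Diagnostic sketch: ask the class loader for the same resource path
// that Spark's JettyUtils.createStaticHandler looks up. If this prints
// "NOT FOUND", the Spark assembly jar (or its static resources) is not
// on the classpath the way the repackaged jar presents it.
public class ResourceCheck {
    public static void main(String[] args) {
        java.net.URL url = ResourceCheck.class.getClassLoader()
                .getResource("org/apache/spark/ui/static");
        System.out.println(url == null
                ? "NOT FOUND on classpath"
                : "found at: " + url);
    }
}
```

Running this inside the same repackaged jar would show whether the jar-in-jar
loader can resolve directory resources the way Spark expects.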

On Wed, Nov 27, 2013 at 6:15 PM, Matei Zaharia <[email protected]> wrote:

> Sorry, what’s the full context for this? Do you have a stack trace? My
> guess is that Spark isn’t on your classpath, or maybe you only have an old
> version of it on there.
>
> Matei
>
> On Nov 27, 2013, at 6:04 PM, Walrus theCat <[email protected]> wrote:
>
> To clarify, I just undid that "var... field.." thing described above, and
> it throws the same error.
>
>
> On Wed, Nov 27, 2013 at 5:53 PM, Walrus theCat <[email protected]> wrote:
>
>> Hi all,
>>
>> This exception gets thrown when I assign a value to the variable holding
>> my SparkContext.  I initialize it as a var holding a null value (so it can
>> be a field), and then give it a value in my main method.  This worked with
>> the previous version of Spark, but is not working on Spark 0.8.0.
>>
> >> Dankeschön,
>>
>> Walrus theCat
>>
>
>
>
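For readers following the thread, the initialization pattern described in the
quoted message (a var field holding null, assigned in main) can be sketched as
follows. Context and its master field are hypothetical stand-ins I introduced
for illustration, since the actual SparkSVM code is not shown in the thread:

```java
// Sketch of the "var ... field ..." pattern from the quoted message,
// using a hypothetical Context class in place of SparkContext.
public class PatternSketch {
    // field declared up front, holding null so it can live at class scope
    static Context sc = null;

    public static void main(String[] args) {
        sc = new Context("local");   // value assigned later, inside main
        System.out.println(sc.master);
    }

    static class Context {
        final String master;
        Context(String master) { this.master = master; }
    }
}
```

The pattern itself is ordinary; per the follow-up message above, undoing it did
not change the error, which points back at the classpath rather than this code.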
