Yeah -- you must be right, but literally nothing else changes between the
code that works and the code that doesn't (including the scripts that
construct the classpath) except the version of Spark.  But yeah, clearly.


On Thu, Dec 5, 2013 at 8:34 PM, Mark Hamstra <[email protected]> wrote:

> Which, again, clearly indicates that you have classpath issues.
>
>
> On Thu, Dec 5, 2013 at 11:00 AM, Walrus theCat <[email protected]> wrote:
>
>> Update on this... it works when I run the main class with the
>> ./run-example script, but not with any form of "scala myjar.jar".
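>>
>> For what it's worth, here's a quick diagnostic (just a sketch, not part
>> of my actual setup) to print the effective classpath from inside the
>> main method, so the two invocations can be compared directly:
>>
>>     // Prints the JVM classpath seen by the running program; run this
>>     // under both ./run-example and plain scala and diff the output
>>     println(System.getProperty("java.class.path"))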
>>
>>
>> On Mon, Dec 2, 2013 at 7:14 PM, Walrus theCat <[email protected]> wrote:
>>
>>> Anyone have any ideas based on the stack trace?
>>>
>>> Thanks
>>>
>>>
>>> On Sun, Dec 1, 2013 at 9:09 PM, Walrus theCat <[email protected]> wrote:
>>>
>>>> Shouldn't?  I imported the new 0.8.0 jars into my build path and had
>>>> to update my imports accordingly.  The only way I supply the Spark jars
>>>> myself is that they get packaged into my executable jar.  The cluster
>>>> should have the right version based on the flag used to launch it (and
>>>> it does).
>>>>
>>>>
>>>> On Fri, Nov 29, 2013 at 10:12 PM, Ashish Rangole <[email protected]> wrote:
>>>>
>>>>> I am sure you have already checked this, any chance the classpath has
>>>>> v 0.7.x jars in it?
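>>>>> One quick way to check (a sketch; adjust for your setup) is to print
>>>>> which jar the SparkContext class actually gets loaded from:
>>>>>
>>>>>     // Prints the location of the jar that provided SparkContext;
>>>>>     // a 0.7.x path here would confirm a stale jar on the classpath
>>>>>     println(classOf[org.apache.spark.SparkContext]
>>>>>       .getProtectionDomain.getCodeSource.getLocation)
>>>>>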
>>>>> On Nov 29, 2013 4:40 PM, "Walrus theCat" <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> The "full context" isn't much -- this is the first thing I do in my
>>>>>> main method (assign a value to sc), and it throws this error.
>>>>>>
>>>>>>
>>>>>> On Fri, Nov 29, 2013 at 10:38 AM, Walrus theCat <
>>>>>> [email protected]> wrote:
>>>>>>
>>>>>>> Hi Matei,
>>>>>>>
>>>>>>> Good to hear from you.  The stack trace is below.  I launched the
>>>>>>> instances with --spark-version=0.8.0 and verified that the version was
>>>>>>> correct by launching spark-shell.  I also verified that the version in
>>>>>>> my project is 0.8.0.  Nothing else should have changed; the scripts I
>>>>>>> use to set up the classpath are exactly the same as the ones I used
>>>>>>> with 0.7.3.
>>>>>>>
>>>>>>> Cheers,
>>>>>>>
>>>>>>>
>>>>>>> java.lang.Exception: Could not find resource path for Web UI: org/apache/spark/ui/static
>>>>>>>     at org.apache.spark.ui.JettyUtils$.createStaticHandler(JettyUtils.scala:89)
>>>>>>>     at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:40)
>>>>>>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:122)
>>>>>>>     at walrusthecat.ml.spark.SparkSVM$.main(SparkSVM.scala:16)
>>>>>>>     at walrusthecat.ml.spark.SparkSVM.main(SparkSVM.scala)
>>>>>>>     at walrusthecat.ml.spark.Main.main(Main.java:7)
>>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>     at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
>>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>>>     at scala.tools.nsc.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:78)
>>>>>>>     at scala.tools.nsc.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:24)
>>>>>>>     at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:88)
>>>>>>>     at scala.tools.nsc.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:78)
>>>>>>>     at scala.tools.nsc.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
>>>>>>>     at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:33)
>>>>>>>     at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:40)
>>>>>>>     at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:60)
>>>>>>>     at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:80)
>>>>>>>     at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
>>>>>>>     at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
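>>>>>>>
>>>>>>> The exception means Spark could not locate its Web UI resources on
>>>>>>> the classpath.  A one-line check (just a sketch) run from the same
>>>>>>> main method shows whether they are visible at all:
>>>>>>>
>>>>>>>     // null means org/apache/spark/ui/static is not on the classpath,
>>>>>>>     // which is what makes createStaticHandler throw this exception
>>>>>>>     println(getClass.getClassLoader.getResource("org/apache/spark/ui/static"))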
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Nov 27, 2013 at 6:15 PM, Matei Zaharia <
>>>>>>> [email protected]> wrote:
>>>>>>>
>>>>>>>> Sorry, what’s the full context for this? Do you have a stack trace?
>>>>>>>> My guess is that Spark isn’t on your classpath, or maybe you only
>>>>>>>> have an old version of it on there.
>>>>>>>>
>>>>>>>> Matei
>>>>>>>>
>>>>>>>> On Nov 27, 2013, at 6:04 PM, Walrus theCat <[email protected]>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> To clarify, I just undid that "var... field.." thing described
>>>>>>>> above, and it throws the same error.
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Nov 27, 2013 at 5:53 PM, Walrus theCat <
>>>>>>>> [email protected]> wrote:
>>>>>>>>
>>>>>>>>> Hi all,
>>>>>>>>>
>>>>>>>>> This exception gets thrown when I assign a value to the variable
>>>>>>>>> holding my SparkContext.  I initialize it as a var holding a null
>>>>>>>>> value (so it can be a field), and then give it a value in my main
>>>>>>>>> method.  This worked with the previous version of Spark, but is not
>>>>>>>>> working on Spark 0.8.0.
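>>>>>>>>>
>>>>>>>>> For reference, the pattern is roughly this (the master URL below is
>>>>>>>>> a placeholder, not my real one):
>>>>>>>>>
>>>>>>>>>     import org.apache.spark.SparkContext
>>>>>>>>>
>>>>>>>>>     object SparkSVM {
>>>>>>>>>       // var initialized to null so the context can live as a field
>>>>>>>>>       var sc: SparkContext = null
>>>>>>>>>
>>>>>>>>>       def main(args: Array[String]) {
>>>>>>>>>         // the assignment below is where the exception gets thrown
>>>>>>>>>         sc = new SparkContext("spark://master:7077", "SparkSVM")
>>>>>>>>>       }
>>>>>>>>>     }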
>>>>>>>>>
>>>>>>>>> Thank you,
>>>>>>>>>
>>>>>>>>> Walrus theCat
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>
>>>
>>
>
