The scripts that Xiangrui mentions set up the classpath for you... Can you
run ./run-example for the provided example successfully?
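
For instance (the example class and arguments here are only illustrative,
adjust to whichever example you pick), from the Spark home directory:

  ./bin/run-example SparkKMeans <input-file> <k> <converge-dist>

If that runs, the Spark classpath itself is set up correctly.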

What you can try is to set SPARK_PRINT_LAUNCH_COMMAND=1 and then call
run-example -- that will print the exact java command used to run the
example at the start of execution. Assuming you can run the examples
successfully, you should be able to just copy that command and add your
jar to the front of the classpath. If that works, you can start removing
extra jars (run-example puts all the example jars on the classpath, which
you won't need).

As you said, the error you see indicates that the class is not
available/visible at runtime, but it's hard to tell why.
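
One detail worth noting: "java -jar myjar.jar" ignores both -cp and the
CLASSPATH variable, so the assembly jar (which bundles breeze) never makes
it onto the runtime classpath that way. A minimal sketch of two
alternatives, assuming your class is still named SparkKMeans with the
package line removed (paths and arguments are illustrative):

  # put the assembly jar and your jar on the classpath explicitly
  java -cp /home/wanda/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.0.4.jar:myjar.jar \
    SparkKMeans <args>

  # or let spark-submit assemble the classpath for you
  ./bin/spark-submit --class SparkKMeans --master local myjar.jar <args>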

On Wed, Jul 2, 2014 at 2:13 AM, Wanda Hawk <wanda_haw...@yahoo.com> wrote:
> I want to make some minor modifications to SparkKMeans.scala, so running
> the basic example won't do.
> I have also packaged my code into a jar file with sbt. It completes
> successfully, but when I try to run it with "java -jar myjar.jar" I get the
> same error:
> "Exception in thread "main" java.lang.NoClassDefFoundError:
> breeze/linalg/Vector
>         at java.lang.Class.getDeclaredMethods0(Native Method)
>         at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
>         at java.lang.Class.getMethod0(Class.java:2774)
>         at java.lang.Class.getMethod(Class.java:1663)
>         at
> sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
>         at
> sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
> "
>
> If "scalac -d classes/ SparkKMeans.scala" can't see my classpath, why does
> it succeeds in compiling and does not give the same error ?
> The error itself "NoClassDefFoundError" means that the files are available
> at compile time, but for some reason I cannot figure out they are not
> available at run time. Does anyone know why ?
>
> Thank you
>
>
> On Tuesday, July 1, 2014 7:03 PM, Xiangrui Meng <men...@gmail.com> wrote:
>
>
> You can use either bin/run-example or bin/spark-submit to run example
> code. "scalac -d classes/ SparkKMeans.scala" doesn't recognize the Spark
> classpath. There are examples in the official doc:
> http://spark.apache.org/docs/latest/quick-start.html#where-to-go-from-here
> -Xiangrui
>
> On Tue, Jul 1, 2014 at 4:39 AM, Wanda Hawk <wanda_haw...@yahoo.com> wrote:
>> Hello,
>>
>> I have installed spark-1.0.0 with Scala 2.10.3. I have built Spark with
>> "sbt/sbt assembly" and added
>>
>> "/home/wanda/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.0.4.jar"
>> to my CLASSPATH variable.
>> Then I went to
>> "../spark-1.0.0/examples/src/main/scala/org/apache/spark/examples", created
>> a new directory "classes", and compiled SparkKMeans.scala with "scalac -d
>> classes/ SparkKMeans.scala".
>> Then I navigated to "classes" (I commented out this line in the Scala file:
>> package org.apache.spark.examples) and tried to run it with "java -cp .
>> SparkKMeans" and I get the following error:
>> "Exception in thread "main" java.lang.NoClassDefFoundError:
>> breeze/linalg/Vector
>>        at java.lang.Class.getDeclaredMethods0(Native Method)
>>        at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
>>        at java.lang.Class.getMethod0(Class.java:2774)
>>        at java.lang.Class.getMethod(Class.java:1663)
>>        at
>> sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
>>        at
>> sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
>> Caused by: java.lang.ClassNotFoundException: breeze.linalg.Vector
>>        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>        at java.security.AccessController.doPrivileged(Native Method)
>>        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>        ... 6 more
>> "
>> The jar under
>>
>> "/home/wanda/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.0.4.jar"
>> contains the breeze/linalg/Vector* classes. I even tried to unpack it and
>> put it on the CLASSPATH, but it does not seem to be picked up.
>>
>>
>> I am currently running java 1.8
>> "java version "1.8.0_05"
>> Java(TM) SE Runtime Environment (build 1.8.0_05-b13)
>> Java HotSpot(TM) 64-Bit Server VM (build 25.5-b02, mixed mode)"
>>
>> What am I doing wrong?
>>
>
>
