It is definitely set correctly in spark-env.sh. Not sure what's
happening...


On Fri, Nov 22, 2013 at 3:57 PM, Patrick Wendell <[email protected]> wrote:

> It looks like scala is not on the classpath. Make sure SCALA_HOME is
> set correctly in spark-env.sh
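
[For reference: a minimal conf/spark-env.sh sketch. The paths below are hypothetical examples, not from this thread; point them at your own installs. The Scala version should match the assembly you built (scala-2.9.3 here).]

```shell
# conf/spark-env.sh -- sourced by the Spark launch scripts at startup.
# SCALA_HOME must point at a Scala 2.9.3 install whose lib/ directory
# contains scala-library.jar. The paths below are hypothetical examples.
export SCALA_HOME=/opt/scala-2.9.3
# Optionally pin the JVM the daemons and executors use (hypothetical path).
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk
```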
>
> On Thu, Nov 21, 2013 at 6:54 PM, Umar Javed <[email protected]> wrote:
> > Sorry if this is abusing this list, but any idea what may be going on
> > here?
> >
> > thanks!
> >
> >
> > On Thu, Nov 21, 2013 at 12:55 PM, Umar Javed <[email protected]> wrote:
> >>
> >> Thanks. Here's the stderr output from the worker. Any ideas (the
> >> compilation seemed to go fine):
> >>
> >> Spark Executor Command: "java" "-cp"
> >> ":/proj/UW-PCP/incubator-spark/conf:/proj/UW-PCP/incubator-spark/assembly/target/scala-2.9.3/spark-assembly-0.9.0-incubating-SNAPSHOT-hadoop1.0.4.jar"
> >> "-Xms512M" "-Xmx512M"
> >> "org.apache.spark.executor.CoarseGrainedExecutorBackend"
> >> "akka://spark@node0-link0:35799/user/CoarseGrainedScheduler" "2"
> >> "node0-link0" "2" "app-20131121135049-0001"
> >> ========================================
> >>
> >> Exception in thread "main" java.lang.NoClassDefFoundError:
> >> scala/ScalaObject
> >>         at java.lang.ClassLoader.defineClass1(Native Method)
> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> >>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> >>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> >>         at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >>         at java.security.AccessController.doPrivileged(Native Method)
> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >>         at java.lang.ClassLoader.defineClass1(Native Method)
> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> >>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> >>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> >>         at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >>         at java.security.AccessController.doPrivileged(Native Method)
> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >>         at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
> >> Caused by: java.lang.ClassNotFoundException: scala.ScalaObject
> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >>         at java.security.AccessController.doPrivileged(Native Method)
> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >>         ... 25 more
> >>
> >>
> >>
> >> On Thu, Nov 21, 2013 at 5:48 AM, Prashant Sharma <[email protected]>
> >> wrote:
> >>>
> >>> You might want to check the stderr and stdout files in the work
> >>> directory (where standalone mode puts the executor logs).
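
[Concretely, standalone mode writes one stdout/stderr pair per executor under the worker's work/ directory. A sketch for dumping them; the default SPARK_HOME below is a hypothetical example, so point it at your install:]

```shell
#!/bin/sh
# Standalone layout: $SPARK_HOME/work/<app-id>/<executor-id>/{stdout,stderr}
# The default below is a hypothetical example path.
SPARK_HOME=${SPARK_HOME:-/opt/incubator-spark}
for f in "$SPARK_HOME"/work/*/*/stderr; do
  [ -f "$f" ] || continue      # glob matched nothing: skip
  echo "== $f =="
  tail -n 40 "$f"              # the exception is usually near the end
done
```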
> >>>
> >>>
> >>> On Thu, Nov 21, 2013 at 7:07 PM, Umar Javed <[email protected]>
> >>> wrote:
> >>>>
> >>>> I have a really simple standalone cluster with one worker located on
> >>>> the same machine as the master. Both master and worker launch OK with
> >>>> the scripts provided in /conf. However when I run the spark shell with
> >>>> the command MASTER=.... ./spark-shell, my worker fails to launch.
> >>>> Here's a section of the log output:
> >>>>
> >>>> 13/11/21 06:32:34 INFO SparkDeploySchedulerBackend: Executor
> >>>> app-20131121063231-0000/4 removed: Command exited with code 1
> >>>> 13/11/21 06:32:34 INFO Client$ClientActor: Executor added:
> >>>> app-20131121063231-0000/5 on worker-20131121063035-node0-link0-52768
> >>>> (node0-link0:7077) with 2 cores
> >>>> 13/11/21 06:32:34 INFO SparkDeploySchedulerBackend: Granted executor
> >>>> ID app-20131121063231-0000/5 on hostPort node0-link0:7077 with 2
> >>>> cores, 512.0 MB RAM
> >>>> 13/11/21 06:32:34 INFO Client$ClientActor: Executor updated:
> >>>> app-20131121063231-0000/5 is now RUNNING
> >>>> 13/11/21 06:32:34 INFO Client$ClientActor: Executor updated:
> >>>> app-20131121063231-0000/5 is now FAILED (Command exited with code 1)
> >>>> 13/11/21 06:32:34 INFO SparkDeploySchedulerBackend: Executor
> >>>> app-20131121063231-0000/5 removed: Command exited with code 1
> >>>>
> >>>>
> >>>> Basically the executor on the worker keeps failing as soon as it is
> >>>> launched.
> >>>> Anybody have a solution?
> >>>>
> >>>> thanks!
> >>>> Umar
> >>>
> >>>
> >>>
> >>>
> >>> --
> >>> s
> >>
> >>
> >
>
