Hi,
I have a 3-node installation of Hadoop 2.2.0 with YARN. I have installed
spark-0.8.1 built with YARN support enabled. I get the following error when
trying to run the examples:
SPARK_JAR=./assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.0.5-alpha.jar \
./spark-class org.apache.spark.deploy.yarn.Client \
  --jar examples/target/scala-2.9.3/spark-examples-assembly-0.8.0-incubating.jar \
  --class org.apache.spark.examples.SparkPi \
  --args yarn-standalone \
  --num-workers 3 \
  --master-memory 4g \
  --worker-memory 2g \
  --worker-cores 1
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/deploy/yarn/Client
Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.Client
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
Could not find the main class: org.apache.spark.deploy.yarn.Client. Program will exit.
The same setup with spark-0.8.0 and hadoop 2.0.5-alpha works fine.
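
My understanding from the "Running on YARN" docs is that the
org.apache.spark.deploy.yarn.Client classes are only included in the assembly
when it is built with YARN support, so I assume the build step should look
roughly like this (variables taken from the docs; this is my guess, not a
verified fix):

  # Build a YARN-enabled Spark assembly against Hadoop 2.2.0
  SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly

Is that the right way to build for Hadoop 2.2.0, and should SPARK_JAR then
point at the resulting hadoop2.2.0 assembly rather than the 2.0.5-alpha one?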
--
/Izhar