Hi there,

I'm trying out Spark Job Server (REST) to submit jobs to a Spark cluster. I
believe my problem is unrelated to this specific software and is really a
generic issue of missing jars on the classpath. Every application
implements the SparkJob trait:

object LongPiJob extends SparkJob {
  ...
}
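For reference, this is roughly the full shape of the class. The body is a
simplified stand-in (not my real job), and the validate/runJob signatures are
what I believe the SparkJob trait in my job-server version expects:

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

object LongPiJob extends SparkJob {

  // Accept any config; a real job would check for required settings here.
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    SparkJobValid

  // Simplified placeholder body: estimate pi with a Monte Carlo sample.
  override def runJob(sc: SparkContext, jobConfig: Config): Any = {
    val samples = 100000
    val hits = sc.parallelize(1 to samples).filter { _ =>
      val x = math.random
      val y = math.random
      x * x + y * y < 1.0
    }.count()
    4.0 * hits / samples
  }
}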

The SparkJob trait comes from the jar file built by the Spark Job Server
Scala application. When I run all of this against a local Spark cluster,
everything works fine once I add this export line to spark-env.sh:

export SPARK_CLASSPATH=$SPARK_HOME/job-server/spark-job-server.jar

However, when I do the same on a Spark cluster on EC2, I get this error:

        java.lang.NoClassDefFoundError: spark/jobserver/SparkJob

I've added the path to spark-env.sh (on the remote Spark master machine on EC2):

export MASTER=`cat /root/spark-ec2/cluster-url`
export SPARK_CLASSPATH=/root/spark/job-server/spark-job-server.jar
export SPARK_SUBMIT_LIBRARY_PATH="$SPARK_SUBMIT_LIBRARY_PATH:/root/ephemeral-hdfs/lib/native/"
export SPARK_SUBMIT_CLASSPATH="$SPARK_CLASSPATH:$SPARK_SUBMIT_CLASSPATH:/root/ephemeral-hdfs/conf"

Also, when I run ./bin/compute-classpath.sh, I can see the required jar, which
defines the "missing" class, in first position on the classpath:

bin]$ ./compute-classpath.sh
Spark assembly has been built with Hive, including Datanucleus jars on classpath
/root/spark/job-server/spark-job-server.jar:/root/spark/job-server/spark-job-server.jar::/root/ephemeral-hdfs/conf:/root/spark/conf:/root/spark/lib/spark-assembly-1.2.0-hadoop1.0.4.jar:/root/spark/lib/datanucleus-core-3.2.10.jar:/root/spark/lib/datanucleus-api-jdo-3.2.6.jar:/root/spark/lib/datanucleus-rdbms-3.2.9.jar
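Just to rule out a bad build, I'm also planning to confirm that the class is
actually packaged inside that jar on the EC2 machine (e.g. with
jar tf /root/spark/job-server/spark-job-server.jar | grep jobserver/SparkJob),
though I'd expect it to be there since the same jar works locally.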


What am I missing? I'd greatly appreciate your help.



