Hi All,

We're trying to run Spark on Mesos with Docker in client mode (since
Mesos doesn't support cluster mode) and load the application jar from
HDFS.  The following is the command we're running:

/usr/local/spark/bin/spark-submit \
  --master mesos://mesos.master:5050 \
  --conf spark.mesos.executor.docker.image=docker.repo/spark:latest \
  --class org.apache.spark.examples.SparkPi \
  hdfs://hdfs1/tmp/spark-examples-1.4.1-hadoop2.6.0-cdh5.4.4.jar 100

Running that command produces the following warning, followed by an
exception:

Warning: Skip remote jar
hdfs://hdfs1/tmp/spark-examples-1.4.1-hadoop2.6.0-cdh5.4.4.jar.
java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi

Before I debug further, is this even supported?  The
submitting-applications docs [1] say the application jar URL can be an
hdfs:// path, but when I started reading the code it wasn't clear that
loading a remote jar in client mode is possible at all (in client mode
the driver runs in the submitting JVM, so presumably the main class has
to end up on its local classpath, which the "Skip remote jar" warning
suggests never happens).  I did see a related thread [2], but it didn't
quite clarify everything I was looking for.
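In case it helps frame the question: the obvious workaround would be to
copy the jar out of HDFS onto the submit host first and hand
spark-submit a local path instead.  A sketch, reusing the paths from
our setup (the /tmp destination is just an example):

# Fetch the application jar from HDFS to the local filesystem
hdfs dfs -get hdfs://hdfs1/tmp/spark-examples-1.4.1-hadoop2.6.0-cdh5.4.4.jar /tmp/

# Submit with a local jar path, which the client-mode driver can load
/usr/local/spark/bin/spark-submit \
  --master mesos://mesos.master:5050 \
  --conf spark.mesos.executor.docker.image=docker.repo/spark:latest \
  --class org.apache.spark.examples.SparkPi \
  /tmp/spark-examples-1.4.1-hadoop2.6.0-cdh5.4.4.jar 100

That would defeat the point of keeping application jars in HDFS,
though, so I'd still like to know whether the hdfs:// form is supposed
to work in client mode.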

Thanks,
- Alan

[1] https://spark.apache.org/docs/latest/submitting-applications.html

[2]
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-submit-not-working-when-application-jar-is-in-hdfs-td21840.html
