Hi All,

We're trying to run Spark with Mesos and Docker in client mode (since Mesos
doesn't support cluster mode) and load the application JAR from HDFS.  The
following is the command we're running:
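For reference, it has roughly this shape (the master URL, Docker image, class
name, and HDFS path below are placeholders, not our real values):

  # all names below are placeholders
  ./bin/spark-submit \
    --master mesos://zk://zk1:2181/mesos \
    --deploy-mode client \
    --class com.example.MyApp \
    --conf spark.mesos.executor.docker.image=example/spark:latest \
    hdfs:///apps/my-app/my-app-assembly.jar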

We're getting the following warning, followed by an exception, from that command:


Before I debug further, is this even supported?  I started reading the code,
and it wasn't clear that it's possible to load a remote JAR in client mode
at all.  I did see a related thread at [2], but it didn't quite clarify
everything I was looking for.
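
(One comparison I plan to try before digging into the code further: pull the
JAR out of HDFS and submit the same job from a local path, to confirm the
HDFS URI is the only variable.  Paths here are placeholders again:

  # hypothetical paths, for the local-vs-HDFS comparison only
  hdfs dfs -get hdfs:///apps/my-app/my-app-assembly.jar /tmp/my-app-assembly.jar
  ./bin/spark-submit \
    --master mesos://zk://zk1:2181/mesos \
    --deploy-mode client \
    --class com.example.MyApp \
    --conf spark.mesos.executor.docker.image=example/spark:latest \
    /tmp/my-app-assembly.jar
)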

Thanks,
- Alan

[1] https://spark.apache.org/docs/latest/submitting-applications.html

[2] http://apache-spark-user-list.1001560.n3.nabble.com/Spark-submit-not-working-when-application-jar-is-in-hdfs-td21840.html



