Hi,

I'm trying to create an application that programmatically submits a jar file to a Spark standalone cluster running on my local PC. However, I always get the following error:

WARN  TaskSetManager:66 - Lost task 1.0 in stage 0.0 (TID 1, 192.168.2.68, executor 0): java.lang.RuntimeException: Stream '/jars/sample-spark-maven-one-jar.jar' was not found.

I'm creating the SparkContext in the following way:

val sparkConf = new SparkConf()
sparkConf.setMaster("spark://zoran-Latitude-E5420:7077")
sparkConf.set("spark.cores.max", "2")
sparkConf.set("spark.executor.memory", "2g")
sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
sparkConf.setAppName("Test application")
sparkConf.set("spark.ui.port", "4041")
sparkConf.set("spark.local.ip", "192.168.2.68")

val oneJar = "/samplesparkmaven/target/sample-spark-maven-one-jar.jar"
sparkConf.setJars(List(oneJar))
val sc = new SparkContext(sparkConf)
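As a first sanity check, something like the sketch below should at least rule out a wrong path on the driver side before the jar is handed to setJars (the require check is my own addition, not anything Spark requires):

import java.io.File

// Verify the jar actually exists on the driver machine, and pass the
// absolute path so the working directory can't cause ambiguity.
val jarFile = new File("/samplesparkmaven/target/sample-spark-maven-one-jar.jar")
require(jarFile.isFile, s"jar not found at ${jarFile.getAbsolutePath}")
sparkConf.setJars(List(jarFile.getAbsolutePath))

If that check passes and the error persists, the failure is presumably happening when the executor tries to fetch '/jars/sample-spark-maven-one-jar.jar' from the driver's file server.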
I'm using Spark 2.1.0 in standalone mode with one master and one worker. Does anyone have an idea where the problem might be, or how to investigate it further?

Thanks,
Zoran


