Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-28 Thread Nikhil Chinnapa
Thanks for explaining in such detail and pointing to the source code.
Yes, it's helpful and it cleared up a lot of confusion.






Re: K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-27 Thread Nikhil Chinnapa
Hi Stavros,

Thanks a lot for pointing me in the right direction. I got caught up in a
release, so I didn't get time to reply earlier.

The mistake was in "LINUX_APP_RESOURCE": I was using the "local" URI scheme
when it should have been "file". Your email is what got me there.
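
For anyone hitting the same issue, a minimal sketch of the change (the jar
path is the one from the stack trace in my original mail below). A
"local://" URI tells each executor the jar already exists on its own
filesystem, while a "file://" URI lets the driver fetch the jar locally and
serve it to the executors:

    // Wrong: every executor expects the jar inside its own image
    // sparkLaunch.setAppResource("local:///opt/spark/examples/jars/reno-spark-codebase-0.1.0.jar");

    // Right: only the driver image needs to contain the jar
    sparkLaunch.setAppResource("file:///opt/spark/examples/jars/reno-spark-codebase-0.1.0.jar");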

What I understood:
Driver image: needs $SPARK_HOME/bin, $SPARK_HOME/jars, and the application jar.
Executor image: just $SPARK_HOME/bin and $SPARK_HOME/jars will suffice.







K8s-Spark client mode : Executor image not able to download application jar from driver

2019-04-16 Thread Nikhil Chinnapa
Environment:
Spark: 2.4.0
Kubernetes: 1.14

Query: Does the application jar need to be part of both the driver and
executor images?

Invocation point (from Java code):
sparkLaunch = new SparkLauncher()
        .setMaster(LINUX_MASTER)
        .setAppResource(LINUX_APP_RESOURCE)
        .setConf("spark.app.name", APP_NAME)
        .setMainClass(MAIN_CLASS)
        .setConf("spark.executor.instances", EXECUTOR_COUNT)
        .setConf("spark.kubernetes.container.image", CONTAINER_IMAGE)
        .setConf("spark.kubernetes.driver.pod.name", DRIVER_POD_NAME)
        .setConf("spark.kubernetes.container.image.pullSecrets", REGISTRY_SECRET)
        .setConf("spark.kubernetes.authenticate.driver.serviceAccountName", SERVICE_ACCOUNT_NAME)
        .setConf("spark.driver.host", SERVICE_NAME + "." + NAMESPACE + ".svc.cluster.local")
        .setConf("spark.driver.port", DRIVER_PORT)
        .setDeployMode("client");
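
For completeness, a minimal sketch of how a launcher built this way might be
started and monitored (startApplication() and SparkAppHandle are part of the
standard org.apache.spark.launcher API; the poll interval is arbitrary and
exception handling is omitted):

    import org.apache.spark.launcher.SparkAppHandle;

    SparkAppHandle handle = sparkLaunch.startApplication();
    // Block until the application reaches a terminal state
    while (!handle.getState().isFinal()) {
        Thread.sleep(1000);
    }
    System.out.println("Final state: " + handle.getState());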

Scenario:
I am trying to run Spark on K8s in client mode. When I put the application
jar in both the driver and executor images, the program works fine.

But if I put the application jar in the driver image only, I get the
following error:

2019-04-16 06:36:44 INFO  Executor:54 - Fetching file:/opt/spark/examples/jars/reno-spark-codebase-0.1.0.jar with timestamp 1555396592768
2019-04-16 06:36:44 INFO  Utils:54 - Copying /opt/spark/examples/jars/reno-spark-codebase-0.1.0.jar to /var/data/spark-d24c8fbc-4fe7-4968-9310-f891a097d1e7/spark-31ba5cbb-3132-408c-991a-795
2019-04-16 06:36:44 ERROR Executor:91 - Exception in task 0.1 in stage 0.0 (TID 2)
java.nio.file.NoSuchFileException: /opt/spark/examples/jars/reno-spark-codebase-0.1.0.jar
    at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:92)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
    at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:116)
    at java.base/sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:548)
    at java.base/sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:254)
    at java.base/java.nio.file.Files.copy(Files.java:1294)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$copyRecursive(Utils.scala:664)
    at org.apache.spark.util.Utils$.copyFile(Utils.scala:635)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:719)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:496)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:805)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:797)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:130)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:797)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:369)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)





