[jira] [Created] (SPARK-30357) SparkContext: Invoking stop() from shutdown hook

2019-12-26 Thread Veera (Jira)
Veera created SPARK-30357:
-

 Summary: SparkContext: Invoking stop() from shutdown hook
 Key: SPARK-30357
 URL: https://issues.apache.org/jira/browse/SPARK-30357
 Project: Spark
  Issue Type: Bug
  Components: Deploy, Kubernetes, MLlib, PySpark
Affects Versions: 2.4.4
 Environment: pivotal kubernetes container service PKS  having 3master 
nodes and 3 worker nodes

 

OS: Ubuntu for the Kubernetes cluster

 

 
Reporter: Veera


I'm getting the error below while running a spark-submit job on Kubernetes; the 
logs don't show any specific cause.

 

19/12/26 07:38:18 INFO SparkContext: Invoking stop() from shutdown hook
19/12/26 07:38:18 INFO SparkUI: Stopped Spark web UI at http://spark-ml-test-1577345808987-driver-svc.spark-jobs.svc:4040
19/12/26 07:38:18 INFO KubernetesClusterSchedulerBackend: Shutting down all executors
19/12/26 07:38:18 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Asking each executor to shut down
19/12/26 07:38:18 WARN ExecutorPodsWatchSnapshotSource: Kubernetes client has been closed (this is expected if the application is shutting down.)
19/12/26 07:38:18 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/12/26 07:38:18 INFO MemoryStore: MemoryStore cleared
19/12/26 07:38:18 INFO BlockManager: BlockManager stopped
19/12/26 07:38:18 INFO BlockManagerMaster: BlockManagerMaster stopped
19/12/26 07:38:18 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/12/26 07:38:18 INFO SparkContext: Successfully stopped SparkContext
19/12/26 07:38:18 INFO ShutdownHookManager: Shutdown hook called
19/12/26 07:38:18 INFO ShutdownHookManager: Deleting directory /var/data/spark-ec98f5d3-022d-4eae-87aa-c23a96c77555/spark-b757b85f-3709-4215-8208-95db840a294a/pyspark-665955c9-0dc6-49ee-9b57-0392be430f10
19/12/26 07:38:18 INFO ShutdownHookManager: Deleting directory /tmp/spark-fa047fa8-b573-44aa-ab65-e235fb0d5b09
19/12/26 07:38:18 INFO ShutdownHookManager: Deleting directory /var/data/spark-ec98f5d3-022d-4eae-87aa-c23a96c77555/spark-b757b85f-3709-4215-8208-95db840a294a


--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-29915) spark-py and spark-r images are not created with docker-image-tool.sh

2019-12-04 Thread Veera (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29915?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16988492#comment-16988492
 ] 

Veera commented on SPARK-29915:
---

Yes, this option is not included, but you can add it manually in the Dockerfiles 
and build your own custom Docker image.
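A rough sketch of that workaround, building the images straight from the Dockerfiles bundled with the distribution instead of relying on docker-image-tool.sh. The paths and the my-tag tag are assumptions based on the standard Spark distribution layout; adjust them for your environment:

```shell
# Run from the root of the unpacked Spark distribution (path is an assumption).
cd /opt/spark

# Build the base JVM image first; the PySpark image layers on top of it.
docker build -t spark:my-tag \
  -f kubernetes/dockerfiles/spark/Dockerfile .

# Build the PySpark image from the Python bindings Dockerfile, pointing
# its base_img build argument at the image built above.
docker build -t spark-py:my-tag \
  --build-arg base_img=spark:my-tag \
  -f kubernetes/dockerfiles/spark/bindings/python/Dockerfile .
```

You can then push the resulting images to your registry and reference them via spark.kubernetes.container.image as usual.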

 

> spark-py and spark-r images are not created with docker-image-tool.sh
> -
>
> Key: SPARK-29915
> URL: https://issues.apache.org/jira/browse/SPARK-29915
> Project: Spark
>  Issue Type: Bug
>  Components: Kubernetes
>Affects Versions: 3.0.0
>Reporter: Michał Wesołowski
>Priority: Major
>
> Currently, at version 3.0.0-preview, the docker-image-tool.sh script has the 
> [following 
> lines|https://github.com/apache/spark/blob/master/bin/docker-image-tool.sh#L173]
>  defined:
> {code:java}
>  local PYDOCKERFILE=${PYDOCKERFILE:-false} 
>  local RDOCKERFILE=${RDOCKERFILE:-false} {code}
> Because of this change, neither spark-py nor spark-r images get created. 






[jira] [Commented] (SPARK-29021) NoSuchElementException: key not found: hostPath.spark-local-dir-5.options.path

2019-12-04 Thread Veera (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29021?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16988487#comment-16988487
 ] 

Veera commented on SPARK-29021:
---

Hi,

I'm also facing this issue while submitting a Spark job:

 

/opt/spark/bin/spark-submit --master XXX/ \
--deploy-mode cluster \
--name spark-wordcount \
--conf spark.kubernetes.namespace=spark \
--conf spark.kubernetes.authenticate.driver.serviceAccountName=spar-sa \
--conf spark.executor.instances=3 \
--conf spark.kubernetes.container.image.pullPolicy=Always \
--conf spark.kubernetes.container.image=container image latest version \
--conf spark.kubernetes.pyspark.pythonVersion=3 \
--conf spark.kubernetes.authenticate.submission.caCertFile=/etc/kubernetes/pki/ca.crt \
--conf spark.kubernetes.driver.volumes.hostPath.sparkwordcount.mount.path=/root \
--conf spark.kubernetes.driver.volumes.hostPath.sparkwordcount.mount.readOnly=false \
--conf spark.kubernetes.driver.volumes.hostPath.sparkwordcount.options.claimName=spark-wordcount-claim \
/root/pyscripts/wordcount.py

 

 

Exception in thread "main" java.util.NoSuchElementException: hostPath.sparkwordcount.options.path
 at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$MapOps$$anonfun$getTry$1.apply(KubernetesVolumeUtils.scala:107)
 at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$MapOps$$anonfun$getTry$1.apply(KubernetesVolumeUtils.scala:107)
 at scala.Option.fold(Option.scala:158)
 at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$MapOps.getTry(KubernetesVolumeUtils.scala:107)
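For what it's worth, the exception names the missing key hostPath.sparkwordcount.options.path: hostPath volumes take an options.path conf (the path on the node), while options.claimName belongs to persistentVolumeClaim volumes. A sketch of the two variants, reusing the sparkwordcount volume name from the command above (the /root/pyscripts host path in variant 1 is an assumption):

```shell
# Variant 1: true hostPath volume -- requires options.path, not claimName.
--conf spark.kubernetes.driver.volumes.hostPath.sparkwordcount.mount.path=/root \
--conf spark.kubernetes.driver.volumes.hostPath.sparkwordcount.mount.readOnly=false \
--conf spark.kubernetes.driver.volumes.hostPath.sparkwordcount.options.path=/root/pyscripts \

# Variant 2: PVC-backed volume, if spark-wordcount-claim is meant to be mounted.
--conf spark.kubernetes.driver.volumes.persistentVolumeClaim.sparkwordcount.mount.path=/root \
--conf spark.kubernetes.driver.volumes.persistentVolumeClaim.sparkwordcount.options.claimName=spark-wordcount-claim \
```

This is only a guess from the conf-key naming scheme; the Spark-on-Kubernetes volume documentation has the authoritative list of per-volume-type options.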

> NoSuchElementException: key not found: hostPath.spark-local-dir-5.options.path
> --
>
> Key: SPARK-29021
> URL: https://issues.apache.org/jira/browse/SPARK-29021
> Project: Spark
>  Issue Type: Bug
>  Components: Kubernetes
>Affects Versions: 2.4.4
>Reporter: Kent Yao
>Priority: Major
>
> Mount hostPath has an issue:
> {code:java}
> Exception in thread "main" java.util.NoSuchElementException: key not found: hostPath.spark-local-dir-5.options.path
>  at scala.collection.MapLike.default(MapLike.scala:235)
>  at scala.collection.MapLike.default$(MapLike.scala:234)
>  at scala.collection.AbstractMap.default(Map.scala:63)
>  at scala.collection.MapLike.apply(MapLike.scala:144)
>  at scala.collection.MapLike.apply$(MapLike.scala:143)
>  at scala.collection.AbstractMap.apply(Map.scala:63)
>  at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$.parseVolumeSpecificConf(KubernetesVolumeUtils.scala:70)
>  at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$.$anonfun$parseVolumesWithPrefix$1(KubernetesVolumeUtils.scala:43)
>  at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
>  at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:321)
>  at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:977)
>  at scala.collection.TraversableLike.map(TraversableLike.scala:237)
>  at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
>  at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:51)
>  at scala.collection.SetLike.map(SetLike.scala:104)
>  at scala.collection.SetLike.map$(SetLike.scala:104)
>  at scala.collection.AbstractSet.map(Set.scala:51)
>  at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$.parseVolumesWithPrefix(KubernetesVolumeUtils.scala:33)
>  at org.apache.spark.deploy.k8s.KubernetesConf$.createDriverConf(KubernetesConf.scala:179)
>  at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:214)
>  at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:198)
>  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:920)
>  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:179)
>  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:202)
>  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:89)
>  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:999)
>  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1008)
>  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}


