Harshwardhan Singh Dodiya created SPARK-44050:
-------------------------------------------------

             Summary: Unable to Mount ConfigMap in Driver Pod - ConfigMap 
Creation Issue
                 Key: SPARK-44050
                 URL: https://issues.apache.org/jira/browse/SPARK-44050
             Project: Spark
          Issue Type: Bug
          Components: Kubernetes, Spark Submit
    Affects Versions: 3.3.1
            Reporter: Harshwardhan Singh Dodiya


Dear Spark community,

I am facing an issue related to mounting a ConfigMap in the driver pod of my 
Spark application. Upon investigation, I realized that the problem is caused by 
the ConfigMap not being created successfully.

*Problem Description:*
When attempting to mount the ConfigMap in the driver pod, I encounter 
consistent failures and the pod stays in the ContainerCreating state. Upon 
further investigation, I discovered that the ConfigMap does not exist in the 
Kubernetes cluster, which prevents the driver pod from accessing the required 
configuration data.

*Additional Information:*

I would like to highlight that this issue does not occur on every submission. 
It appears randomly, affecting the mounting of the ConfigMap in the driver 
pod only approximately 5% of the time. This intermittent behavior adds 
complexity to troubleshooting, as the issue is challenging to reproduce 
consistently.

*Error Message:*

When describing the driver pod (kubectl describe pod pod_name), I get the below error:

"ConfigMap '<configmap-name>' not found."
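One way to confirm whether the ConfigMap was actually created is to query the cluster directly. The commands below are a minimal sketch; the namespace and driver pod name are placeholders, and the `-driver-conf-map` suffix assumes the default name Spark gives the driver ConfigMap it generates at submission time.

```shell
# List ConfigMaps in the application's namespace; with the default naming,
# the Spark-generated one ends in "-driver-conf-map" (assumption).
kubectl get configmaps -n <namespace> | grep driver-conf-map

# Inspect the events attached to the stuck driver pod for the mount failure.
kubectl get events -n <namespace> \
  --field-selector involvedObject.name=<driver-pod-name>
```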

*To Reproduce:*

1. Download Spark 3.3.1 from [https://spark.apache.org/downloads.html]

2. Build a container image with "bin/docker-image-tool.sh"

3. Submit the application from the Spark client via spark-submit, passing all 
the required details and configurations.

4. Randomly, in some of the driver pods, we can observe this issue.
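For reference, step 3 can be sketched roughly as below. This is an illustrative submission only, assuming cluster deploy mode against Kubernetes; the API server address, namespace, image, and example job are placeholders, not the reporter's actual values.

```shell
# Hypothetical spark-submit invocation that triggers driver ConfigMap
# creation and mounting on Spark 3.3.1 (all bracketed values are placeholders).
./bin/spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.namespace=<namespace> \
  --conf spark.kubernetes.container.image=<registry>/spark:3.3.1 \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.3.1.jar
```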

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
