Ajit-mycom opened a new issue, #41454:
URL: https://github.com/apache/airflow/issues/41454

   ### Official Helm Chart version
   
   1.13.1
   
   ### Apache Airflow version
   
   2.8.1
   
   ### Kubernetes Version
   
   1.29
   
   ### Helm Chart configuration
   
   _No response_
   
   ### Docker Image customizations
   
   _No response_
   
   ### What happened
   
   I have deployed Apache Airflow with `multiNamespaceMode: true`, which resulted in the creation of a ClusterRole whose permissions are limited to pods only. However, I need to add permissions for `sparkapplications` in the API group `sparkoperator.k8s.io`, because I am encountering the following error:
   
   `cannot create resource "sparkapplications" in API group "sparkoperator.k8s.io"`
   
   I explored two potential solutions:
   
   1. **Manual Role editing:** manually editing or patching the ClusterRole is not permitted, so this does not provide a sustainable solution.
   2. **Local chart modification:** modifying the ClusterRole in a local copy of the Airflow chart would resolve the issue, but it introduces complexity, since upgrades and feature management would then require maintaining a custom chart.
   
   Given these constraints, I am seeking guidance from the Airflow team on how to address this issue. Is there an alternative approach to managing `sparkapplications` permissions within the current deployment setup? Your assistance would be greatly appreciated.
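   As an interim workaround that does not require forking the chart, a separate ClusterRole and ClusterRoleBinding can be applied alongside the release to grant the worker service account access to `sparkapplications`. A minimal sketch, assuming the release is named `airflow` in namespace `airflow` and the workers run as the `airflow-worker` service account (all of these names are assumptions; adjust to your deployment):

   ```yaml
   # Supplementary RBAC applied outside the chart; names/namespaces are assumptions.
   apiVersion: rbac.authorization.k8s.io/v1
   kind: ClusterRole
   metadata:
     name: airflow-spark-applications   # any unused name
   rules:
     - apiGroups: ["sparkoperator.k8s.io"]
       resources: ["sparkapplications", "sparkapplications/status"]
       verbs: ["create", "get", "list", "watch", "patch", "delete"]
   ---
   apiVersion: rbac.authorization.k8s.io/v1
   kind: ClusterRoleBinding
   metadata:
     name: airflow-spark-applications
   roleRef:
     apiGroup: rbac.authorization.k8s.io
     kind: ClusterRole
     name: airflow-spark-applications
   subjects:
     - kind: ServiceAccount
       name: airflow-worker      # assumed worker service account name
       namespace: airflow        # assumed release namespace
   ```

   Because these objects are not chart-managed, `kubectl apply -f` once and they survive `helm upgrade` untouched.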
   
   ### What you think should happen instead
   
   The chart should create a ClusterRole and ClusterRoleBinding that also cover `sparkapplications` (or allow extra rules to be configured); currently the ClusterRole created with `multiNamespaceMode: true` only grants permissions on pods.
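   Conceptually, the rendered ClusterRole would then contain a rule for the Spark Operator CRD alongside the existing pod rule. A hedged sketch of what that might look like (the exact verbs and resources the chart grants today may differ; the `sparkoperator.k8s.io` rule is the proposed addition, not current chart behavior):

   ```yaml
   rules:
     - apiGroups: [""]                      # roughly what the chart grants today
       resources: ["pods", "pods/log"]
       verbs: ["create", "get", "list", "watch", "delete", "patch"]
     - apiGroups: ["sparkoperator.k8s.io"]  # proposed addition
       resources: ["sparkapplications"]
       verbs: ["create", "get", "list", "watch", "delete"]
   ```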
   
   ### How to reproduce
   
   1. Deploy the official Helm chart (1.13.1) with `multiNamespaceMode: true`; the chart creates a ClusterRole with permissions limited to pods only.
   2. Run a task that creates a `sparkapplications` resource in the API group `sparkoperator.k8s.io`.
   3. The task fails with: `cannot create resource "sparkapplications" in API group "sparkoperator.k8s.io"`.
   
   ### Anything else
   
   No
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
