This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-kubernetes-operator.git


The following commit(s) were added to refs/heads/main by this push:
     new 90e4777  [SPARK-53639] Use `spark` consistently for `release-name` of Helm installation
90e4777 is described below

commit 90e4777046d09f1c8964ddf1c7c6f1e2c4c478f2
Author: Dongjoon Hyun <dongj...@apache.org>
AuthorDate: Thu Sep 18 11:38:04 2025 -0700

    [SPARK-53639] Use `spark` consistently for `release-name` of Helm installation
    
    ### What changes were proposed in this pull request?
    
    This PR aims to use `spark` consistently for the `release-name` of the Helm installation.
    
    ### Why are the changes needed?
    
    Although we use `spark` in `README.md` and on the website, as in the following example, old legacy names still exist.
    
    https://github.com/apache/spark-kubernetes-operator/blob/cfd5063c595c445eee3d9d94e5dc9d17590ed5d4/README.md?plain=1#L49
    
    We need to unify this. Otherwise, `INSTALLATION FAILED` occurs due to a mismatch of the `meta.helm.sh/release-name` annotation.
    
    ```
    $ helm install sparkx --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
    Error: INSTALLATION FAILED: Unable to continue with install: ServiceAccount "spark-operator" in namespace "default" exists and cannot be imported into the current release: invalid ownership metadata; annotation validation error: key "meta.helm.sh/release-name" must equal "sparkx": current value is "spark"
    ```
    
    ### Does this PR introduce _any_ user-facing change?
    
    No for clean installations.
    
    Users with an existing installation need to remove the old K8s objects before installing under the new release name.
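    For illustration, a migration could look like the following sketch. The commands and the old release name `spark-kubernetes-operator` are taken from this diff; verify the actual release name in your cluster with `helm list` first.

    ```bash
    # Confirm the name of the existing release (assumed below to be the
    # old default, spark-kubernetes-operator).
    helm list --all-namespaces

    # Remove the old release and the objects it owns (ServiceAccount, RBAC, ...).
    helm uninstall spark-kubernetes-operator

    # Reinstall under the unified release name `spark`.
    helm install spark --create-namespace \
      -f build-tools/helm/spark-kubernetes-operator/values.yaml \
      build-tools/helm/spark-kubernetes-operator/
    ```

    Note that any objects not owned by the release (for example, manually created namespaces) may need to be cleaned up separately.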
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #323 from dongjoon-hyun/SPARK-53639.
    
    Authored-by: Dongjoon Hyun <dongj...@apache.org>
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
---
 .github/workflows/build_and_test.yml            | 6 +++---
 docs/configuration.md                           | 2 +-
 docs/operations.md                              | 2 +-
 tests/e2e/watched-namespaces/chainsaw-test.yaml | 2 +-
 4 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 589607b..94f090c 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -125,8 +125,8 @@ jobs:
         run: |
           eval $(minikube docker-env)
           ./gradlew buildDockerImage
-          helm install spark-kubernetes-operator --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
-          helm test spark-kubernetes-operator
+          helm install spark --create-namespace -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
+          helm test spark
           # Use remote host's docker image
           minikube docker-env --unset
       - name: Run E2E Test with Dynamic Configuration Disabled
@@ -138,7 +138,7 @@ jobs:
         run: |
           eval $(minikube docker-env)
           ./gradlew buildDockerImage
-          helm install spark-kubernetes-operator --create-namespace -f \
+          helm install spark --create-namespace -f \
           build-tools/helm/spark-kubernetes-operator/values.yaml -f \
           tests/e2e/helm/dynamic-config-values.yaml \
           build-tools/helm/spark-kubernetes-operator/
diff --git a/docs/configuration.md b/docs/configuration.md
index b957285..f471b02 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -122,7 +122,7 @@ sink.PrometheusPullModelSink
 * Install Spark Operator
 
 ```bash
-helm install spark-kubernetes-operator -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
+helm install spark -f build-tools/helm/spark-kubernetes-operator/values.yaml build-tools/helm/spark-kubernetes-operator/
 ```
 
 * Install Prometheus via Helm Chart
diff --git a/docs/operations.md b/docs/operations.md
index 7646dce..433e127 100644
--- a/docs/operations.md
+++ b/docs/operations.md
@@ -50,7 +50,7 @@ You can also provide multiple custom values file by using the `-f` flag, the lat
 higher precedence:
 
 ```bash
-helm install spark-kubernetes-operator \
+helm install spark \
    -f build-tools/helm/spark-kubernetes-operator/values.yaml \
    -f my_values.yaml \
    build-tools/helm/spark-kubernetes-operator/
diff --git a/tests/e2e/watched-namespaces/chainsaw-test.yaml b/tests/e2e/watched-namespaces/chainsaw-test.yaml
index a9d5a58..95d4a00 100644
--- a/tests/e2e/watched-namespaces/chainsaw-test.yaml
+++ b/tests/e2e/watched-namespaces/chainsaw-test.yaml
@@ -76,7 +76,7 @@ spec:
       - script:
           content: |
            echo "Installing another spark operator in default-2 namespaces, watching on namespace: spark-3"
-            helm install spark-kubernetes-operator -n default-2 --create-namespace -f \
+            helm install spark -n default-2 --create-namespace -f \
             ../../../build-tools/helm/spark-kubernetes-operator/values.yaml -f \
             ../helm/dynamic-config-values-2.yaml \
             ../../../build-tools/helm/spark-kubernetes-operator/

