This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-kubernetes-operator.git


The following commit(s) were added to refs/heads/main by this push:
     new 9423208  [SPARK-53853] Add `Example` section in `operations.md`
9423208 is described below

commit 9423208438a565a6f680993c398c84ba1371c2d9
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Oct 9 00:00:05 2025 -0700

    [SPARK-53853] Add `Example` section in `operations.md`
    
    ### What changes were proposed in this pull request?
    
    This PR aims to add an `Example` section to `operations.md`.
    
    ### Why are the changes needed?
    
    To illustrate how to install and maintain multiple operators for different workloads.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No. This is a documentation addition.
    
    ### How was this patch tested?
    
    Manual review.
    
    - https://github.com/dongjoon-hyun/spark-kubernetes-operator/blob/SPARK-53853/docs/operations.md#example
    
      - **OUTLINE**
        <img width="311" height="279" alt="Screenshot 2025-10-08 at 22 13 14" src="https://github.com/user-attachments/assets/f7c5b5d5-c99c-4db2-b10d-8a959205dca2" />
    
      - **SECTION**
        <img width="1029" height="633" alt="Screenshot 2025-10-08 at 22 23 44" src="https://github.com/user-attachments/assets/aac7f785-7c8d-48fa-9775-9776216529b5" />
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #380 from dongjoon-hyun/SPARK-53853.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 docs/operations.md | 35 +++++++++++++++++++++++++++++++++++
 1 file changed, 35 insertions(+)

diff --git a/docs/operations.md b/docs/operations.md
index 22c8b21..ad2fd10 100644
--- a/docs/operations.md
+++ b/docs/operations.md
@@ -155,3 +155,38 @@ metadata:
   labels:
     "spark.operator/sentinel": "true"
 ```
+
+## Example
+
+Install the Helm chart in the `us-west-1` and `us-west-2` namespaces.
+
+```bash
+helm install us-west-1 spark/spark-kubernetes-operator --create-namespace --namespace us-west-1 --set operatorRbac.clusterRole.name=spark-operator-clusterrole-us-west-1 --set operatorRbac.clusterRoleBinding.name=spark-operator-clusterrolebinding-us-west-1 --set workloadResources.clusterRole.name=spark-workload-clusterrole-us-west-1
+```
+
+```bash
+helm install us-west-2 spark/spark-kubernetes-operator --create-namespace --namespace us-west-2 --set operatorRbac.clusterRole.name=spark-operator-clusterrole-us-west-2 --set operatorRbac.clusterRoleBinding.name=spark-operator-clusterrolebinding-us-west-2 --set workloadResources.clusterRole.name=spark-workload-clusterrole-us-west-2
+```
+
+Check both installations.
+
+```bash
+$ helm list -A
+NAME      NAMESPACE REVISION UPDATED                              STATUS   CHART                           APP VERSION
+us-west-1 us-west-1 1        2025-10-08 22:04:45.530136 -0700 PDT deployed spark-kubernetes-operator-1.3.0 0.5.0
+us-west-2 us-west-2 1        2025-10-08 22:04:48.747434 -0700 PDT deployed spark-kubernetes-operator-1.3.0 0.5.0
+```
+
+Launch `pi.yaml` in the `us-west-1` and `us-west-2` namespaces.
+
+```bash
+kubectl apply -f https://apache.github.io/spark-kubernetes-operator/pi.yaml -n us-west-1
+kubectl apply -f https://apache.github.io/spark-kubernetes-operator/pi.yaml -n us-west-2
+```
+
+```bash
+$ kubectl get sparkapp -A
+NAMESPACE   NAME   CURRENT STATE    AGE
+us-west-1   pi     RunningHealthy   8s
+us-west-2   pi     RunningHealthy   3s
+```
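
For reference, a minimal cleanup sketch for the example above, assuming the same release and namespace names (`us-west-1`, `us-west-2`); these are standard Helm/kubectl commands rather than anything added by this patch:

```bash
# Delete the sample applications submitted from pi.yaml
kubectl delete sparkapp pi -n us-west-1
kubectl delete sparkapp pi -n us-west-2

# Uninstall both operator releases
helm uninstall us-west-1 -n us-west-1
helm uninstall us-west-2 -n us-west-2

# --create-namespace does not imply removal on uninstall;
# drop the namespaces explicitly if they are no longer needed
kubectl delete namespace us-west-1 us-west-2
```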

