This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-kubernetes-operator.git


The following commit(s) were added to refs/heads/main by this push:
     new 2021ff6  [SPARK-52608] Make `README.md` up-to-date
2021ff6 is described below

commit 2021ff6fc4f6299e707eda4b95fbf806280d4ddc
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Sat Jun 28 14:59:28 2025 -0700

    [SPARK-52608] Make `README.md` up-to-date
    
    ### What changes were proposed in this pull request?
    
    This PR aims to make `README.md` up-to-date.
    
    ### Why are the changes needed?
    
    To match https://apache.github.io/spark-kubernetes-operator/:
    - The Helm commands are updated (the resulting commands are shown below).
    - The `Spark Cluster` example output is regenerated.
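    
    For reference, the updated Helm commands, copied from the diff below:
    
    ```bash
    helm repo add spark https://apache.github.io/spark-kubernetes-operator
    helm repo update
    helm install spark spark/spark-kubernetes-operator
    ```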
    
    ### Does this PR introduce _any_ user-facing change?
    
    No behavior change.
    
    ### How was this patch tested?
    
    Manual review.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #263 from dongjoon-hyun/SPARK-52608.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 README.md | 18 +++++++++---------
 1 file changed, 9 insertions(+), 9 deletions(-)

diff --git a/README.md b/README.md
index b998b38..a55c218 100644
--- a/README.md
+++ b/README.md
@@ -17,9 +17,9 @@ Apache Spark provides a Helm Chart.
 - <https://artifacthub.io/packages/helm/spark-kubernetes-operator/spark-kubernetes-operator/>
 
 ```bash
-helm repo add spark-kubernetes-operator https://apache.github.io/spark-kubernetes-operator
+helm repo add spark https://apache.github.io/spark-kubernetes-operator
 helm repo update
-helm install spark-kubernetes-operator spark-kubernetes-operator/spark-kubernetes-operator
+helm install spark spark/spark-kubernetes-operator
 ```
 
 ## Building Spark K8s Operator
@@ -75,21 +75,21 @@ $ kubectl port-forward prod-master-0 6066 &
 $ ./examples/submit-pi-to-prod.sh
 {
   "action" : "CreateSubmissionResponse",
-  "message" : "Driver successfully submitted as driver-20240821181327-0000",
+  "message" : "Driver successfully submitted as driver-20250628212324-0000",
   "serverSparkVersion" : "4.0.0",
-  "submissionId" : "driver-20240821181327-0000",
+  "submissionId" : "driver-20250628212324-0000",
   "success" : true
 }
 
-$ curl http://localhost:6066/v1/submissions/status/driver-20240821181327-0000/
+$ curl http://localhost:6066/v1/submissions/status/driver-20250628212324-0000/
 {
   "action" : "SubmissionStatusResponse",
   "driverState" : "FINISHED",
   "serverSparkVersion" : "4.0.0",
-  "submissionId" : "driver-20240821181327-0000",
+  "submissionId" : "driver-20250628212324-0000",
   "success" : true,
-  "workerHostPort" : "10.1.5.188:42099",
-  "workerId" : "worker-20240821181236-10.1.5.188-42099"
+  "workerHostPort" : "10.1.0.88:34643",
+  "workerId" : "worker-20250628212306-10.1.0.88-34643"
 }
 
 $ kubectl delete sparkcluster prod
@@ -146,7 +146,7 @@ No resources found in default namespace.
 Remove HelmChart and CRDs.
 
 ```bash
-helm uninstall spark-kubernetes-operator
+helm uninstall spark
 
 kubectl delete crd sparkapplications.spark.apache.org
 

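As a side note, not part of this commit: the `/v1/submissions` REST gateway exercised in the second hunk can also cancel a driver. A minimal sketch, assuming the `kubectl port-forward` from the README example is still active and reusing the illustrative submission ID:

```bash
# Kill a running driver through Spark's standalone REST submission API;
# /v1/submissions/kill/<submissionId> mirrors the status endpoint shown above.
curl -X POST http://localhost:6066/v1/submissions/kill/driver-20250628212324-0000
```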

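Similarly, a quick way to verify that the uninstall steps in the last hunk took effect; `helm list` and the CRD query are standard Helm/kubectl commands, and the chart is assumed to have been installed into the current namespace:

```bash
# Confirm the Helm release is gone and no operator CRDs remain.
helm list
kubectl get crd | grep spark.apache.org || echo "no Spark operator CRDs remain"
```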
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
