This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-kubernetes-operator.git


The following commit(s) were added to refs/heads/main by this push:
     new 563f4a7  [SPARK-53910] Add `StatefulSet`-based SparkApp example
563f4a7 is described below

commit 563f4a765f166e1db6d1568fd54b133c9c9c30f2
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Tue Oct 14 17:46:46 2025 -0700

    [SPARK-53910] Add `StatefulSet`-based SparkApp example
    
    ### What changes were proposed in this pull request?
    
    This PR aims to add a `StatefulSet`-based SparkApp example.
    
    ### Why are the changes needed?
    
    To provide a `StatefulSet` example with `spark.kubernetes.allocation.pods.allocator=statefulset`.
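
    For reference, the setting that switches the executor pod allocator is the
    following `sparkConf` entry, excerpted from the new example file (the full
    manifest appears in the diff below):

    ```yaml
    sparkConf:
      spark.kubernetes.allocation.pods.allocator: "statefulset"
    ```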
    
    Note that this requires the following to be part of `v0.6.0`.
    - #389
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, this is a new example.
    
    ### How was this patch tested?
    
    Manual review.
    
    ```
    $ kubectl apply -f examples/pi-statefulset.yaml
    sparkapplication.spark.apache.org/pi-statefulset created
    ```
    
    ```
    $ kubectl get sparkapp
    NAME             CURRENT STATE    AGE
    pi-statefulset   RunningHealthy   5s
    ```
    
    ```
    $ kubectl get pod
    NAME                                        READY   STATUS    RESTARTS   AGE
    pi-statefulset-0-driver                     1/1     Running   0          7s
    spark-kubernetes-operator-cdb4b547d-5bhfm   1/1     Running   0          80s
    spark-s-pi-statefulset-0-0-0                1/1     Running   0          5s
    spark-s-pi-statefulset-0-0-1                1/1     Running   0          5s
    spark-s-pi-statefulset-0-0-2                1/1     Running   0          5s
    ```
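
    With the `statefulset` allocator, the executor pods above are owned by a
    `StatefulSet` rather than created individually by the driver. A sketch of
    how to confirm this against a live cluster (the pod name is taken from the
    output above; the exact `StatefulSet` name is generated by Spark):

    ```
    $ kubectl get statefulset
    $ kubectl describe pod spark-s-pi-statefulset-0-0-0 | grep 'Controlled By'
    ```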
    
    ```
    $ kubectl get sparkapp
    NAME             CURRENT STATE      AGE
    pi-statefulset   ResourceReleased   48s
    ```
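
    To remove the example once it has reached a terminal state, the manifest
    can be deleted the same way it was applied (this assumes the file path used
    in the first step):

    ```
    $ kubectl delete -f examples/pi-statefulset.yaml
    ```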
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #390 from dongjoon-hyun/SPARK-53910.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 examples/pi-statefulset.yaml | 31 +++++++++++++++++++++++++++++++
 1 file changed, 31 insertions(+)

diff --git a/examples/pi-statefulset.yaml b/examples/pi-statefulset.yaml
new file mode 100644
index 0000000..e7ac510
--- /dev/null
+++ b/examples/pi-statefulset.yaml
@@ -0,0 +1,31 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+apiVersion: spark.apache.org/v1
+kind: SparkApplication
+metadata:
+  name: pi-statefulset
+spec:
+  mainClass: "org.apache.spark.examples.SparkPi"
+  driverArgs: ["20000"]
+  jars: "local:///opt/spark/examples/jars/spark-examples.jar"
+  sparkConf:
+    spark.executor.instances: "3"
+    spark.kubernetes.allocation.pods.allocator: "statefulset"
+    spark.kubernetes.authenticate.driver.serviceAccountName: "spark"
+    spark.kubernetes.container.image: "apache/spark:4.0.1"
+  applicationTolerations:
+    resourceRetainPolicy: OnFailure
+  runtimeVersions:
+    sparkVersion: "4.0.1"


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
