This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-kubernetes-operator.git


The following commit(s) were added to refs/heads/main by this push:
     new f596723  [SPARK-55563] Add `JWSFilter`-enabled Spark History Server example
f596723 is described below

commit f5967231fda67cd0ae5c213d85a835c5f1d246bc
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Mon Feb 16 17:13:40 2026 -0800

    [SPARK-55563] Add `JWSFilter`-enabled Spark History Server example
    
    ### What changes were proposed in this pull request?
    
    This PR aims to add a `JWSFilter`-enabled Spark History Server example.
    
    ### Why are the changes needed?
    
    To provide an example of using the built-in `JWSFilter` to provide more secure Web UI access.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Manual review.
    
    **Launch SHS server with JWSFilter**
    ```
    $ kubectl apply -f examples/spark-history-server-with-jws-filter.yaml
    $ kubectl port-forward svc/spark-history-server-with-jws-filter-0-driver-svc 18080
    ```
    
    **Use `curl` to visit**
    ```
    $ curl -I http://localhost:18080/
    HTTP/1.1 403 Forbidden
    Date: Mon, 16 Feb 2026 23:36:54 GMT
    Cache-Control: must-revalidate,no-cache,no-store
    Content-Type: text/html;charset=iso-8859-1
    Content-Length: 472
    
    $ curl -v -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.e30.4EKWlOkobpaAPR0J4BE0cPQ-ZD1tRQKLZp1vtE7upPw" http://localhost:18080/
    * Host localhost:18080 was resolved.
    * IPv6: ::1
    * IPv4: 127.0.0.1
    *   Trying [::1]:18080...
    * Connected to localhost (::1) port 18080
    > GET / HTTP/1.1
    > Host: localhost:18080
    > User-Agent: curl/8.7.1
    > Accept: */*
    > Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.e30.4EKWlOkobpaAPR0J4BE0cPQ-ZD1tRQKLZp1vtE7upPw
    >
    * Request completely sent off
    < HTTP/1.1 200 OK
    ```
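The `Bearer` token in the transcript above is an ordinary JWS (RFC 7515): an HS256-signed `header.payload` pair with an empty claims object (`e30` is base64url for `{}`). As a minimal sketch, not part of this patch, the following shows how such a token can be minted from the example's base64-encoded `secretKey` using only the Python standard library; it assumes `JWSFilter` verifies a standard HS256 signature over the usual `header.payload` signing input with the decoded secret as the HMAC key.

```python
import base64
import hashlib
import hmac
import json

# The secretKey value from the example YAML (base64 of an ASCII sentence).
SECRET_B64 = "VmlzaXQgaHR0cHM6Ly9zcGFyay5hcGFjaGUub3JnIHRvIGRvd25sb2FkIEFwYWNoZSBTcGFyay4="

def b64url(data: bytes) -> str:
    # JWS uses unpadded base64url encoding for all three token segments.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def make_token(secret_b64: str) -> str:
    key = base64.b64decode(secret_b64)
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}, separators=(",", ":")).encode())
    payload = b64url(b"{}")  # empty claims, as in the curl example above
    signing_input = f"{header}.{payload}".encode("ascii")
    signature = b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

token = make_token(SECRET_B64)
print(token)
```

A token built this way can then be passed via `curl -H "Authorization: Bearer $TOKEN"` as shown above.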
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    Generated-by: `Gemini 3 Pro (High)` on `Antigravity`
    
    Closes #506 from dongjoon-hyun/SPARK-55563.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 examples/spark-history-server-with-jws-filter.yaml | 47 ++++++++++++++++++++++
 1 file changed, 47 insertions(+)

diff --git a/examples/spark-history-server-with-jws-filter.yaml b/examples/spark-history-server-with-jws-filter.yaml
new file mode 100644
index 0000000..0746924
--- /dev/null
+++ b/examples/spark-history-server-with-jws-filter.yaml
@@ -0,0 +1,47 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+apiVersion: spark.apache.org/v1
+kind: SparkApplication
+metadata:
+  name: spark-history-server-with-jws-filter
+spec:
+  mainClass: "org.apache.spark.deploy.history.HistoryServer"
+  sparkConf:
+    spark.jars.packages: "org.apache.hadoop:hadoop-aws:3.4.2"
+    spark.jars.ivy: "/tmp/.ivy2.5.2"
+    spark.driver.memory: "2g"
+    spark.kubernetes.authenticate.driver.serviceAccountName: "spark"
+    spark.kubernetes.container.image: "apache/spark:{{SPARK_VERSION}}-scala"
+    spark.ui.port: "18080"
+    spark.history.fs.logDirectory: "s3a://spark-events"
+    spark.history.fs.cleaner.enabled: "true"
+    spark.history.fs.cleaner.maxAge: "30d"
+    spark.history.fs.cleaner.maxNum: "100"
+    spark.history.fs.eventLog.rolling.maxFilesToRetain: "10"
+    spark.hadoop.fs.defaultFS: "s3a://spark-events"
+    spark.hadoop.fs.s3a.endpoint: "http://localstack:4566"
+    spark.hadoop.fs.s3a.path.style.access: "true"
+    spark.hadoop.fs.s3a.access.key: "test"
+    spark.hadoop.fs.s3a.secret.key: "test"
+    spark.kubernetes.driver.pod.excludedFeatureSteps: "org.apache.spark.deploy.k8s.features.KerberosConfDriverFeatureStep"
+    # JWS Filter Configuration
+    spark.ui.filters: "org.apache.spark.ui.JWSFilter"
+    spark.org.apache.spark.ui.JWSFilter.param.secretKey: "VmlzaXQgaHR0cHM6Ly9zcGFyay5hcGFjaGUub3JnIHRvIGRvd25sb2FkIEFwYWNoZSBTcGFyay4="
+  runtimeVersions:
+    sparkVersion: "4.1.1"
+  applicationTolerations:
+    restartConfig:
+      restartPolicy: Always
+      maxRestartAttempts: 9223372036854775807


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
