This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new ba291a619042 [SPARK-55509][K8S][DOCS] Update `YuniKorn` docs with `1.8.0`
ba291a619042 is described below
commit ba291a61904283f03b279fd7ebd1c7cd1b7767b8
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Feb 12 18:33:53 2026 -0800
[SPARK-55509][K8S][DOCS] Update `YuniKorn` docs with `1.8.0`
### What changes were proposed in this pull request?
This PR aims to recommend `YuniKorn` v1.8.0 for Apache Spark 4.2.0
documentation.
### Why are the changes needed?
Apache YuniKorn v1.8.0 is a new feature release with 93 patches.
- https://yunikorn.apache.org/release-announce/1.8.0 (2026-02-03)
- https://issues.apache.org/jira/projects/YUNIKORN/versions/12355110
- https://github.com/apache/yunikorn-k8shim/pull/990
- https://github.com/apache/yunikorn-core/pull/1046
### Does this PR introduce _any_ user-facing change?
### How was this patch tested?
Pass the CIs and manual test.
I installed YuniKorn v1.8.0 on K8s 1.35 and tested manually.
**K8s v1.35**
```
$ kubectl version
Client Version: v1.35.0
Kustomize Version: v5.7.1
Server Version: v1.35.0+k3s3
```
**YuniKorn v1.8.0**
```
$ helm list -n yunikorn
NAME      NAMESPACE  REVISION  UPDATED                               STATUS    CHART           APP VERSION
yunikorn  yunikorn   1         2026-02-12 16:35:00.201861 -0800 PST  deployed  yunikorn-1.8.0
```
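For context, the `helm list` output above is consistent with installing the release via the Helm commands from the updated documentation in this commit's diff; a sketch of that setup (release name, namespace, and `--set embedAdmissionController=false` are taken from the diff and the `helm list` output, the final `kubectl` check is an added suggestion):

```shell
# Install Apache YuniKorn v1.8.0 via Helm (mirrors the docs change in this commit).
helm repo add yunikorn https://apache.github.io/yunikorn-release
helm repo update
helm install yunikorn yunikorn/yunikorn --namespace yunikorn --version 1.8.0 \
  --create-namespace --set embedAdmissionController=false

# Confirm the scheduler pod is running before starting the integration tests.
kubectl get pods -n yunikorn
```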
```
$ build/sbt -Pkubernetes -Pkubernetes-integration-tests \
    -Dspark.kubernetes.test.deployMode=rancher-desktop \
    "kubernetes-integration-tests/testOnly *.YuniKornSuite" \
    -Dtest.exclude.tags=minikube,local,decom,r -Dtest.default.exclude.tags=
...
[info] YuniKornSuite:
[info] - SPARK-42190: Run SparkPi with local[*] (8 seconds, 675 milliseconds)
[info] - Run SparkPi with no resources (12 seconds, 571 milliseconds)
[info] - SPARK-53944: Run SparkPi without driver service (10 seconds, 429 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (10 seconds, 472 milliseconds)
[info] - Run SparkPi with a very long application name. (11 seconds, 461 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (10 seconds, 455 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (12 seconds, 596 milliseconds)
[info] - Run SparkPi with an argument. (11 seconds, 577 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (10 seconds, 431 milliseconds)
[info] - All pods have the same service account by default (11 seconds, 437 milliseconds)
[info] - Run extraJVMOptions check on driver (6 seconds, 272 milliseconds)
[info] - SPARK-42474: Run extraJVMOptions JVM GC option check - G1GC (7 seconds, 498 milliseconds)
[info] - SPARK-42474: Run extraJVMOptions JVM GC option check - Other GC (6 seconds, 285 milliseconds)
[info] - SPARK-42769: All executor pods have SPARK_DRIVER_POD_IP env variable (11 seconds, 503 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (12 seconds, 290 milliseconds)
[info] - Run SparkPi with env and mount secrets. (15 seconds, 277 milliseconds)
[info] - Run PySpark on simple pi.py example (11 seconds, 525 milliseconds)
[info] - Run PySpark to test a pyfiles example (13 seconds, 780 milliseconds)
[info] - Run PySpark with memory customization (10 seconds, 529 milliseconds)
[info] - Run PySpark with Spark Connect !!! IGNORED !!!
[info] - Run in client mode. (6 seconds, 230 milliseconds)
[info] - Start pod creation from template (12 seconds, 537 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (11 seconds, 463 milliseconds)
[info] - A driver-only Spark job with a tmpfs-backed localDir volume (8 seconds, 615 milliseconds)
[info] - A driver-only Spark job with a tmpfs-backed emptyDir data volume (8 seconds, 879 milliseconds)
[info] - A driver-only Spark job with a disk-backed emptyDir volume (7 seconds, 663 milliseconds)
[info] - A driver-only Spark job with an OnDemand PVC volume (13 seconds, 186 milliseconds)
[info] - A Spark job with tmpfs-backed localDir volumes (10 seconds, 323 milliseconds)
[info] - A Spark job with two executors with OnDemand PVC volumes (15 seconds, 780 milliseconds)
...
[info] Tests: succeeded 28, failed 0, canceled 2, ignored 2, pending 0
[info] All tests passed.
[success] Total time: 527 s (0:08:47.0), completed Feb 12, 2026, 6:32:10 PM
```
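Beyond the integration suite, one way to spot-check a YuniKorn deployment manually is to submit an example job with the YuniKorn-related configurations documented in `running-on-kubernetes.md` and confirm the driver pod was handed to the `yunikorn` scheduler. This is a sketch, not part of the commit; `$K8S_MASTER`, `$IMAGE`, and the examples jar path are placeholders:

```shell
# Hypothetical spot-check: submit SparkPi with the YuniKorn scheduler config
# from the Spark K8s docs ($K8S_MASTER, $IMAGE, and jar path are placeholders).
bin/spark-submit \
  --master k8s://$K8S_MASTER \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.container.image=$IMAGE \
  --conf spark.kubernetes.scheduler.name=yunikorn \
  --conf spark.kubernetes.driver.label.queue=root.default \
  --conf spark.kubernetes.executor.label.queue=root.default \
  local:///opt/spark/examples/jars/spark-examples.jar

# The driver pod should report yunikorn as its scheduler.
kubectl get pod -l spark-role=driver \
  -o jsonpath='{.items[0].spec.schedulerName}'
```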
### Was this patch authored or co-authored using generative AI tooling?
Generated-by: `Gemini 3 Pro (High)` on `Antigravity`
Closes #54294 from dongjoon-hyun/SPARK-55509.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
docs/running-on-kubernetes.md | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/running-on-kubernetes.md b/docs/running-on-kubernetes.md
index 873814e2becd..b41cad1bebd0 100644
--- a/docs/running-on-kubernetes.md
+++ b/docs/running-on-kubernetes.md
@@ -2034,10 +2034,10 @@ Install Apache YuniKorn:
```bash
helm repo add yunikorn https://apache.github.io/yunikorn-release
helm repo update
-helm install yunikorn yunikorn/yunikorn --namespace yunikorn --version 1.7.0 --create-namespace --set embedAdmissionController=false
+helm install yunikorn yunikorn/yunikorn --namespace yunikorn --version 1.8.0 --create-namespace --set embedAdmissionController=false
```
-The above steps will install YuniKorn v1.7.0 on an existing Kubernetes cluster.
+The above steps will install YuniKorn v1.8.0 on an existing Kubernetes cluster.
##### Get started
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]