This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new c1a5e26d21e [SPARK-42150][K8S][DOCS] Upgrade `Volcano` to 1.7.0
c1a5e26d21e is described below
commit c1a5e26d21ec0b985bdcc48cb64c5203a6c79957
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Sat Jan 21 16:35:05 2023 -0800
[SPARK-42150][K8S][DOCS] Upgrade `Volcano` to 1.7.0
### What changes were proposed in this pull request?
This PR aims to upgrade `Volcano` from 1.5.1 to 1.7.0.
### Why are the changes needed?
Volcano 1.7.0 finally provides a `multi-arch image` and `K8s 1.25` support.
- https://github.com/volcano-sh/volcano/releases/tag/v1.7.0
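As a quick sanity check of the `multi-arch image` claim (not part of this PR; the `volcanosh/vc-scheduler` image name is an assumption about the Volcano installer, adjust if it differs), the published manifest list can be inspected:
```
# Sketch: list the architectures published for the v1.7.0 scheduler image.
# The image name is assumed from the Volcano installer manifest.
$ docker manifest inspect volcanosh/vc-scheduler:v1.7.0 | grep architecture
```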
### Does this PR introduce _any_ user-facing change?
No. This is a doc-only change.
### How was this patch tested?
Manually verified with `Volcano 1.7.0`.
```
$ build/sbt -Psparkr -Pkubernetes -Pvolcano -Pkubernetes-integration-tests \
    -Dtest.exclude.tags=minikube,local -Dtest.include.tags=volcano \
    -Dspark.kubernetes.test.deployMode=docker-desktop \
    "kubernetes-integration-tests/test"
...
[info] VolcanoSuite:
[info] - Run SparkPi with volcano scheduler (8 seconds, 331 milliseconds)
[info] - SPARK-38187: Run SparkPi Jobs with minCPU (30 seconds, 563 milliseconds)
[info] - SPARK-38187: Run SparkPi Jobs with minMemory (27 seconds, 505 milliseconds)
[info] - SPARK-38188: Run SparkPi jobs with 2 queues (only 1 enabled) (12 seconds, 141 milliseconds)
[info] - SPARK-38188: Run SparkPi jobs with 2 queues (all enabled) (21 seconds, 247 milliseconds)
[info] - SPARK-38423: Run driver job to validate priority order (21 seconds, 516 milliseconds)
[info] YuniKornSuite:
[info] Run completed in 2 minutes, 37 seconds.
[info] Total number of tests run: 6
[info] Suites: completed 3, aborted 0
[info] Tests: succeeded 6, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 412 s (06:52), completed Jan 21, 2023, 8:04:01 AM
```
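For reference, a minimal pre-test setup sketch (the manifest URL comes from the updated README below; the `volcano-system` namespace and the readiness wait are assumptions about the default installer):
```
# Sketch: install Volcano 1.7.0 and wait for its pods before running the suite.
# Assumes the installer places its components in the volcano-system namespace.
$ kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.7.0/installer/volcano-development.yaml
$ kubectl -n volcano-system wait --for=condition=Ready pods --all --timeout=300s
```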
Closes #39690 from dongjoon-hyun/SPARK-42150.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
resource-managers/kubernetes/integration-tests/README.md | 16 ++--------------
1 file changed, 2 insertions(+), 14 deletions(-)
diff --git a/resource-managers/kubernetes/integration-tests/README.md b/resource-managers/kubernetes/integration-tests/README.md
index 82e4de9bade..c3bce2415e4 100644
--- a/resource-managers/kubernetes/integration-tests/README.md
+++ b/resource-managers/kubernetes/integration-tests/README.md
@@ -339,11 +339,7 @@ Volcano integration is experimental in Apache Spark 3.3.0 and the test coverage
## Installation
- # x86_64
- kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.5.1/installer/volcano-development.yaml
-
- # arm64:
- kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.5.1/installer/volcano-development-arm64.yaml
+ kubectl apply -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.7.0/installer/volcano-development.yaml
## Run tests
@@ -364,13 +360,5 @@ You can also specify `volcano` tag to only run Volcano test:
## Cleanup Volcano
- # x86_64
- kubectl delete -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.5.1/installer/volcano-development.yaml
-
- # arm64:
- kubectl delete -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.5.1/installer/volcano-development-arm64.yaml
-
- # Cleanup Volcano webhook
- kubectl delete validatingwebhookconfigurations volcano-admission-service-jobs-validate volcano-admission-service-pods-validate volcano-admission-service-queues-validate
- kubectl delete mutatingwebhookconfigurations volcano-admission-service-jobs-mutate volcano-admission-service-podgroups-mutate volcano-admission-service-pods-mutate volcano-admission-service-queues-mutate
+ kubectl delete -f https://raw.githubusercontent.com/volcano-sh/volcano/v1.7.0/installer/volcano-development.yaml
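After the simplified cleanup step above, a hedged way to confirm that deleting the v1.7.0 manifest also removed the admission webhooks (the resource names are assumed to still contain "volcano", as in the v1.5.1 commands that were dropped):
```
# Sketch: verify no Volcano webhook configurations remain after cleanup.
# Expects empty output if the manifest deletion cleaned everything up.
$ kubectl get validatingwebhookconfigurations,mutatingwebhookconfigurations | grep volcano
```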
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]