This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new a569fca2e9b4 [SPARK-46770][K8S][TESTS] Remove legacy `docker-for-desktop` logic
a569fca2e9b4 is described below
commit a569fca2e9b4b819494e23e0c63bfd69101f02e0
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri Jan 19 00:22:25 2024 -0800
[SPARK-46770][K8S][TESTS] Remove legacy `docker-for-desktop` logic
### What changes were proposed in this pull request?
This PR aims to remove the legacy `docker-for-desktop` logic in favor of
`docker-desktop`.
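For illustration only, here is a minimal, self-contained Scala sketch of the deploy-mode dispatch after this cleanup. It is not the actual Spark source: names are simplified, strings stand in for the real backend objects, and `resolve` is a hypothetical helper.
```
// Hypothetical, simplified model of the deploy-mode dispatch after this PR:
// the legacy `docker-for-desktop` value no longer matches anything.
object BackendDispatchSketch {
  val BACKEND_MINIKUBE = "minikube"
  val BACKEND_CLOUD = "cloud"
  val BACKEND_DOCKER_DESKTOP = "docker-desktop"

  def resolve(deployMode: String): String = deployMode match {
    case BACKEND_MINIKUBE => "MinikubeTestBackend"
    case BACKEND_CLOUD => "KubeConfigBackend"
    case BACKEND_DOCKER_DESKTOP => "DockerForDesktopBackend"
    // `docker-for-desktop` now falls through to the error case.
    case other => throw new IllegalArgumentException("Invalid deploy mode: " + other)
  }

  def main(args: Array[String]): Unit = {
    println(resolve("docker-desktop"))  // DockerForDesktopBackend
    // resolve("docker-for-desktop")    // would now throw IllegalArgumentException
  }
}
```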
### Why are the changes needed?
- Docker Desktop switched the underlying node name and context to
`docker-desktop` in 2020.
- https://github.com/docker/for-win/issues/5089#issuecomment-582752325
- Since Apache Spark 3.2.2, we have hidden it from the documentation via
SPARK-38272, and now we can delete it.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass the CIs and manually test with Docker Desktop.
```
$ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests -Dtest.exclude.tags=minikube,local -Dspark.kubernetes.test.deployMode=docker-desktop "kubernetes-integration-tests/test"
...
[info] KubernetesSuite:
[info] - SPARK-42190: Run SparkPi with local[*] (12 seconds, 759 milliseconds)
[info] - Run SparkPi with no resources (13 seconds, 747 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (19 seconds, 688 milliseconds)
[info] - Run SparkPi with a very long application name. (12 seconds, 436 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (17 seconds, 411 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (12 seconds, 352 milliseconds)
[info] - Run SparkPi with an argument. (17 seconds, 481 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (12 seconds, 375 milliseconds)
[info] - All pods have the same service account by default (17 seconds, 375 milliseconds)
[info] - Run extraJVMOptions check on driver (9 seconds, 362 milliseconds)
[info] - SPARK-42474: Run extraJVMOptions JVM GC option check - G1GC (12 seconds, 319 milliseconds)
[info] - SPARK-42474: Run extraJVMOptions JVM GC option check - Other GC (9 seconds, 280 milliseconds)
[info] - SPARK-42769: All executor pods have SPARK_DRIVER_POD_IP env variable (12 seconds, 404 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (18 seconds, 198 milliseconds)
[info] - Run SparkPi with env and mount secrets. (19 seconds, 463 milliseconds)
[info] - Run PySpark on simple pi.py example (18 seconds, 373 milliseconds)
[info] - Run PySpark to test a pyfiles example (14 seconds, 435 milliseconds)
[info] - Run PySpark with memory customization (17 seconds, 334 milliseconds)
[info] - Run in client mode. (5 seconds, 235 milliseconds)
[info] - Start pod creation from template (12 seconds, 447 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (17 seconds, 351 milliseconds)
[info] - Test basic decommissioning (45 seconds, 365 milliseconds)
[info] - Test basic decommissioning with shuffle cleanup (49 seconds, 679 milliseconds)
[info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 52 seconds)
[info] - Test decommissioning timeouts (50 seconds, 379 milliseconds)
[info] - SPARK-37576: Rolling decommissioning (1 minute, 17 seconds)
[info] - Run SparkR on simple dataframe.R example (19 seconds, 453 milliseconds)
[info] YuniKornSuite:
[info] Run completed in 14 minutes, 39 seconds.
[info] Total number of tests run: 27
[info] Suites: completed 2, aborted 0
[info] Tests: succeeded 27, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 1078 s (17:58), completed Jan 19, 2024, 12:12:23 AM
```
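With the legacy alias removed, the valid values of `spark.kubernetes.test.deployMode` are `minikube`, `cloud`, and `docker-desktop`; passing `docker-for-desktop` now fails fast with an `IllegalArgumentException` (see the `IntegrationTestBackend` hunk below).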
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #44796 from dongjoon-hyun/SPARK-46770.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.../kubernetes/integration-tests/scripts/setup-integration-test-env.sh | 2 +-
.../org/apache/spark/deploy/k8s/integrationtest/PVTestsSuite.scala | 2 +-
.../org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala | 1 -
.../deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala | 2 +-
4 files changed, 3 insertions(+), 4 deletions(-)
diff --git a/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh b/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
index 721a60b0aef4..48ba96d979fc 100755
--- a/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
+++ b/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
@@ -140,7 +140,7 @@ then
fi
;;
- docker-desktop | docker-for-desktop)
+ docker-desktop)
# Only need to build as this will place it in our local Docker repo which is all
# we need for Docker for Desktop to work so no need to also push
$SPARK_INPUT_DIR/bin/docker-image-tool.sh -r $IMAGE_REPO -t $IMAGE_TAG $JAVA_IMAGE_TAG_BUILD_ARG $LANGUAGE_BINDING_BUILD_ARGS build
diff --git a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/PVTestsSuite.scala b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/PVTestsSuite.scala
index 2745aa7dbe61..fe1a24e3e1e6 100644
--- a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/PVTestsSuite.scala
+++ b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/PVTestsSuite.scala
@@ -72,7 +72,7 @@ private[spark] trait PVTestsSuite { k8sSuite: KubernetesSuite =>
.withMatchExpressions(new NodeSelectorRequirementBuilder()
.withKey("kubernetes.io/hostname")
.withOperator("In")
- .withValues("minikube", "m01", "docker-for-desktop", "docker-desktop")
+ .withValues("minikube", "m01", "docker-desktop")
.build()).build())
.endRequired()
.endNodeAffinity()
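For context, here is a standalone sketch of the trimmed node-affinity match expression, assuming the fabric8 Kubernetes model classes used in the hunk above are on the classpath; `NodeAffinitySketch` is a hypothetical wrapper, not part of the test suite.
```
import io.fabric8.kubernetes.api.model.{NodeSelectorRequirement, NodeSelectorRequirementBuilder}

object NodeAffinitySketch {
  def main(args: Array[String]): Unit = {
    // After this change, only minikube's node names and `docker-desktop` remain.
    val requirement: NodeSelectorRequirement = new NodeSelectorRequirementBuilder()
      .withKey("kubernetes.io/hostname")
      .withOperator("In")
      .withValues("minikube", "m01", "docker-desktop")
      .build()
    println(requirement.getValues) // [minikube, m01, docker-desktop]
  }
}
```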
diff --git a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala
index bf001666c2e0..2dce5165fc11 100644
--- a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala
+++ b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala
@@ -18,7 +18,6 @@ package org.apache.spark.deploy.k8s.integrationtest
object TestConstants {
val BACKEND_MINIKUBE = "minikube"
- val BACKEND_DOCKER_FOR_DESKTOP = "docker-for-desktop"
val BACKEND_DOCKER_DESKTOP = "docker-desktop"
val BACKEND_CLOUD = "cloud"
diff --git a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
index f767f9be55a1..6a35645710a8 100644
--- a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
+++ b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
@@ -43,7 +43,7 @@ private[spark] object IntegrationTestBackendFactory {
case BACKEND_MINIKUBE => MinikubeTestBackend
case BACKEND_CLOUD =>
new KubeConfigBackend(System.getProperty(CONFIG_KEY_KUBE_CONFIG_CONTEXT))
- case BACKEND_DOCKER_FOR_DESKTOP | BACKEND_DOCKER_DESKTOP => DockerForDesktopBackend
+ case BACKEND_DOCKER_DESKTOP => DockerForDesktopBackend
case _ => throw new IllegalArgumentException("Invalid " +
CONFIG_KEY_DEPLOY_MODE + ": " + deployMode)
}
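Note that the backing object keeps its `DockerForDesktopBackend` name: this change only removes the legacy string value from the dispatch, not the backend implementation itself.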
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]