This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.2 by this push:
new 416f86d [SPARK-38272][K8S][TESTS] Use `docker-desktop` instead of `docker-for-desktop` for Docker K8S IT deployMode and context name
416f86d is described below
commit 416f86dd82deec105b2acde4c85704288e7e6c56
Author: Yikun Jiang <[email protected]>
AuthorDate: Tue Feb 22 18:40:01 2022 -0800
[SPARK-38272][K8S][TESTS] Use `docker-desktop` instead of `docker-for-desktop` for Docker K8S IT deployMode and context name
### What changes were proposed in this pull request?
Change `docker-for-desktop` to `docker-desktop`.
### Why are the changes needed?
The Kubernetes context name on Docker for Desktop should be
`docker-desktop` rather than `docker-for-desktop`:
```
$ k config current-context
docker-desktop
```
According to the
[comments](https://github.com/docker/for-win/issues/5089#issuecomment-582752325),
since Docker Desktop v2.4 (the current version is v4.5.1), Docker keeps
`docker-for-desktop` as a legacy alias that links to the `docker-desktop` cluster.
See also here:
https://github.com/apache/spark/pull/35557#issuecomment-1046609601 .
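The compatibility this change preserves can be sketched as a mapping from either accepted context name to the canonical one, mirroring what the setup script now does (the helper name `normalize_context` is hypothetical, not part of the patch):

```shell
#!/bin/sh
# Hypothetical sketch: map either accepted context name to the canonical
# `docker-desktop`, mirroring the alias handling in this patch.
normalize_context() {
  case "$1" in
    docker-desktop | docker-for-desktop) echo "docker-desktop" ;;
    *) echo "$1" ;;
  esac
}
normalize_context docker-for-desktop   # prints "docker-desktop"
```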
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
- CI passed
- build/sbt -Dspark.kubernetes.test.deployMode=docker-for-desktop -Pvolcano -Pkubernetes -Pkubernetes-integration-tests -Dtest.exclude.tags=minikube,r "kubernetes-integration-tests/test"
- build/sbt -Dspark.kubernetes.test.deployMode=docker-desktop -Pvolcano -Pkubernetes -Pkubernetes-integration-tests -Dtest.exclude.tags=minikube,r "kubernetes-integration-tests/test"
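Before running either sbt invocation above, it can help to fail fast when the active kubeconfig context is not a Docker Desktop one. A minimal guard, accepting both the canonical name and the legacy alias (the helper `require_docker_desktop` is hypothetical, not part of the patch):

```shell
#!/bin/sh
# Hypothetical guard: succeed only for the two context names the
# integration tests accept for the Docker Desktop backend.
require_docker_desktop() {
  case "$1" in
    docker-desktop | docker-for-desktop) return 0 ;;
    *) echo "unexpected context: $1" >&2; return 1 ;;
  esac
}
```

Usage, assuming `kubectl` is on the PATH: `require_docker_desktop "$(kubectl config current-context)" || exit 1`.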
Closes #35595 from Yikun/SPARK-38272.
Authored-by: Yikun Jiang <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit ceb32c9e67b5e9456c4f82366d9695bb59e32762)
Signed-off-by: Dongjoon Hyun <[email protected]>
---
resource-managers/kubernetes/integration-tests/README.md | 8 ++++----
.../integration-tests/scripts/setup-integration-test-env.sh | 2 +-
.../apache/spark/deploy/k8s/integrationtest/TestConstants.scala | 1 +
.../k8s/integrationtest/backend/IntegrationTestBackend.scala | 2 +-
.../integrationtest/backend/docker/DockerForDesktopBackend.scala | 2 +-
5 files changed, 8 insertions(+), 7 deletions(-)
diff --git a/resource-managers/kubernetes/integration-tests/README.md b/resource-managers/kubernetes/integration-tests/README.md
index 9a0689f..9d3efa4 100644
--- a/resource-managers/kubernetes/integration-tests/README.md
+++ b/resource-managers/kubernetes/integration-tests/README.md
@@ -44,11 +44,11 @@ Uses the local `minikube` cluster, this requires that `minikube` 1.7.3 or greater
at least 4 CPUs and 6GB memory (some users have reported success with as few as 3 CPUs and 4GB memory). The tests will
check if `minikube` is started and abort early if it isn't currently running.
-### `docker-for-desktop`
+### `docker-desktop`
Since July 2018 Docker for Desktop provide an optional Kubernetes cluster that can be enabled as described in this
[blog post](https://blog.docker.com/2018/07/kubernetes-is-now-available-in-docker-desktop-stable-channel/). Assuming
-this is enabled using this backend will auto-configure itself from the `docker-for-desktop` context that Docker creates
+this is enabled using this backend will auto-configure itself from the `docker-desktop` context that Docker creates
in your `~/.kube/config` file. If your config file is in a different location
you should set the `KUBECONFIG`
environment variable appropriately.
@@ -132,7 +132,7 @@ properties to Maven. For example:
-Dspark.kubernetes.test.imageTag=sometag \
-Dspark.kubernetes.test.imageRepo=docker.io/somerepo \
-Dspark.kubernetes.test.namespace=spark-int-tests \
-    -Dspark.kubernetes.test.deployMode=docker-for-desktop \
+    -Dspark.kubernetes.test.deployMode=docker-desktop \
-Dtest.include.tags=k8s
@@ -165,7 +165,7 @@ to the wrapper scripts and using the wrapper scripts will simply set these appro
<td><code>spark.kubernetes.test.deployMode</code></td>
<td>
The integration test backend to use. Acceptable values are <code>minikube</code>,
- <code>docker-for-desktop</code> and <code>cloud</code>.
+ <code>docker-desktop</code> and <code>cloud</code>.
<td><code>minikube</code></td>
</tr>
<tr>
diff --git a/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh b/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
index beda56c..685b986 100755
--- a/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
+++ b/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
@@ -127,7 +127,7 @@ then
fi
;;
- docker-for-desktop)
+ docker-desktop | docker-for-desktop)
# Only need to build as this will place it in our local Docker repo which is all
# we need for Docker for Desktop to work so no need to also push
$SPARK_INPUT_DIR/bin/docker-image-tool.sh -r $IMAGE_REPO -t $IMAGE_TAG $JAVA_IMAGE_TAG_BUILD_ARG $LANGUAGE_BINDING_BUILD_ARGS build
diff --git a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala
index 2b1fd08..c46839f 100644
--- a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala
+++ b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/TestConstants.scala
@@ -19,6 +19,7 @@ package org.apache.spark.deploy.k8s.integrationtest
object TestConstants {
val BACKEND_MINIKUBE = "minikube"
val BACKEND_DOCKER_FOR_DESKTOP = "docker-for-desktop"
+ val BACKEND_DOCKER_DESKTOP = "docker-desktop"
val BACKEND_CLOUD = "cloud"
val CONFIG_KEY_DEPLOY_MODE = "spark.kubernetes.test.deployMode"
diff --git a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
index 36c3b6a..ced8151 100644
--- a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
+++ b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
@@ -43,7 +43,7 @@ private[spark] object IntegrationTestBackendFactory {
case BACKEND_MINIKUBE => MinikubeTestBackend
case BACKEND_CLOUD =>
        new KubeConfigBackend(System.getProperty(CONFIG_KEY_KUBE_CONFIG_CONTEXT))
-      case BACKEND_DOCKER_FOR_DESKTOP => DockerForDesktopBackend
+      case BACKEND_DOCKER_FOR_DESKTOP | BACKEND_DOCKER_DESKTOP => DockerForDesktopBackend
case _ => throw new IllegalArgumentException("Invalid " +
CONFIG_KEY_DEPLOY_MODE + ": " + deployMode)
}
diff --git a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/docker/DockerForDesktopBackend.scala b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/docker/DockerForDesktopBackend.scala
index 81a11ae..f206bef 100644
--- a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/docker/DockerForDesktopBackend.scala
+++ b/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/docker/DockerForDesktopBackend.scala
@@ -20,6 +20,6 @@ import org.apache.spark.deploy.k8s.integrationtest.TestConstants
import org.apache.spark.deploy.k8s.integrationtest.backend.cloud.KubeConfigBackend
private[spark] object DockerForDesktopBackend
- extends KubeConfigBackend(TestConstants.BACKEND_DOCKER_FOR_DESKTOP) {
+ extends KubeConfigBackend(TestConstants.BACKEND_DOCKER_DESKTOP) {
}
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]