dongjoon-hyun commented on a change in pull request #35364:
URL: https://github.com/apache/spark/pull/35364#discussion_r818455216



##########
File path: project/SparkBuild.scala
##########
@@ -671,9 +670,9 @@ object KubernetesIntegrationTests {
     (Test / javaOptions) ++= Seq(
       s"-Dspark.kubernetes.test.deployMode=${deployMode.getOrElse("minikube")}",
       s"-Dspark.kubernetes.test.imageTag=${imageTag.value}",
-      s"-Dspark.kubernetes.test.namespace=${namespace.value}",
       s"-Dspark.kubernetes.test.unpackSparkDir=$sparkHome"
     ),
+    (Test / javaOptions) ++= namespace.map("-Dspark.kubernetes.test.namespace=" + _),

Review comment:
       This change was intentional, @martin-g.
   - The new default is consistent with the `Maven` integration test's default coverage, which uses random namespaces.
   - Previously, we missed the namespace-propagation issue because SBT tested only with the single `default` namespace at that time. I fixed that later via ed3ea989f97634374789d173f1cc932230bf3aa1.
   
   After this change, Maven and SBT have the same test coverage and behavior.
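To illustrate why the diff moves the namespace flag out of the `Seq(...)` literal: `Option.map` yields a zero-or-one-element collection, so appending it with `++=` adds the `-D` flag only when a namespace is actually configured, instead of always emitting one with a possibly empty value. A minimal sketch of that behavior in plain Scala (outside sbt; the `javaOptions` helper and sample values here are hypothetical, not from `SparkBuild.scala`):

```scala
// Sketch: conditionally appending a JVM system-property flag
// based on an optional setting, mirroring the diff's use of
// `namespace.map("-Dspark.kubernetes.test.namespace=" + _)`.
object NamespaceFlagSketch {
  // `namespace` stands in for the optional sbt setting in the PR.
  def javaOptions(namespace: Option[String]): Seq[String] = {
    val base = Seq(
      "-Dspark.kubernetes.test.deployMode=minikube",
      "-Dspark.kubernetes.test.imageTag=dev"
    )
    // None contributes nothing; Some(ns) contributes exactly one flag.
    base ++ namespace.map("-Dspark.kubernetes.test.namespace=" + _)
  }

  def main(args: Array[String]): Unit = {
    // With no namespace set, no namespace flag appears at all,
    // so the test framework can fall back to a random namespace.
    assert(javaOptions(None).size == 2)
    // With a namespace set, the flag is appended once.
    assert(javaOptions(Some("spark-123")).last ==
      "-Dspark.kubernetes.test.namespace=spark-123")
    println("ok")
  }
}
```

This works because Scala implicitly converts an `Option` to an `Iterable` of zero or one elements when it is appended to a `Seq`.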




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
