[GitHub] spark pull request #21652: [SPARK-24551][K8S] Add integration tests for secr...

2018-07-02 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/21652#discussion_r199610045
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---
@@ -74,10 +76,12 @@ private[spark] class KubernetesSuite extends 
SparkFunSuite
 testBackend = IntegrationTestBackendFactory.getTestBackend
 testBackend.initialize()
 kubernetesTestComponents = new 
KubernetesTestComponents(testBackend.getKubernetesClient)
+createTestSecret()
   }
 
   override def afterAll(): Unit = {
 testBackend.cleanUp()
+deleteTestSecret()
--- End diff --

Same question about every test versus just one of them.


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #21652: [SPARK-24551][K8S] Add integration tests for secr...

2018-07-02 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/21652#discussion_r199609986
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---
@@ -74,10 +76,12 @@ private[spark] class KubernetesSuite extends 
SparkFunSuite
 testBackend = IntegrationTestBackendFactory.getTestBackend
 testBackend.initialize()
 kubernetesTestComponents = new 
KubernetesTestComponents(testBackend.getKubernetesClient)
+createTestSecret()
--- End diff --

Should this be done before every test, or just the one that is using the 
secrets?
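One way to limit the secret to only the tests that use it, rather than creating it for the whole suite in beforeAll, is a loan-pattern fixture. The sketch below is illustrative only: the names and the in-memory Set stand in for the real Kubernetes-client calls in the PR.

```scala
// Loan-pattern sketch: only the body passed to withTestSecret sees the
// secret, and it is deleted even if the body throws. The Set stands in
// for calls against the real Kubernetes client; all names are hypothetical.
object SecretFixture {
  private var secrets = Set.empty[String]

  def createTestSecret(name: String): Unit = secrets += name
  def deleteTestSecret(name: String): Unit = secrets -= name
  def secretExists(name: String): Boolean = secrets(name)

  def withTestSecret[T](name: String)(testBody: => T): T = {
    createTestSecret(name)
    try testBody
    finally deleteTestSecret(name)
  }
}
```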


---




[GitHub] spark pull request #21652: [SPARK-24551][K8S] Add integration tests for secr...

2018-07-02 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/21652#discussion_r199609501
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---
@@ -265,6 +323,37 @@ private[spark] class KubernetesSuite extends 
SparkFunSuite
 assert(envVars("ENV2") === "VALUE2")
   }
 
+  private def executeCommand(cmd: String*)(implicit podName: String): 
String = {
+val out = new ByteArrayOutputStream()
+val watch = kubernetesTestComponents
+  .kubernetesClient
+  .pods()
+  .withName(podName)
+  .readingInput(System.in)
+  .writingOutput(out)
+  .writingError(System.err)
+  .withTTY()
+  .exec(cmd.toArray: _*)
+// wait to get some result back
+Thread.sleep(1000)
--- End diff --

This is a pretty quick timeout. Is there some way to have a longer timeout 
but to return instantly if the watch command exits? E.g. some kind of "join" 
method call on the watch instance that itself has a timeout?
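The suggested "join with a timeout" could be built on a CountDownLatch that an exec-completion callback counts down, so the wait ends as soon as the command finishes but still has an upper bound. This is a minimal sketch of the idea, not the fabric8 API itself: `onExecClosed` here stands in for whatever close callback the client exposes, and the listener wiring is elided.

```scala
import java.util.concurrent.{CountDownLatch, TimeUnit}

// Instead of Thread.sleep(1000), block on a latch that the exec's close
// callback releases: returns immediately on completion, or false on timeout.
final class ExecJoin(timeoutSeconds: Long) {
  private val done = new CountDownLatch(1)

  // Call this from the exec client's close/completion callback (assumed name).
  def onExecClosed(): Unit = done.countDown()

  /** Returns true if the exec finished before the timeout elapsed. */
  def join(): Boolean = done.await(timeoutSeconds, TimeUnit.SECONDS)
}
```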


---




[GitHub] spark issue #21672: [SPARK-24694][K8S] Pass all app args to integration test...

2018-06-29 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21672
  
BTW, for committers - I think this patch is good to merge.


---




[GitHub] spark issue #21672: [SPARK-24694][K8S] Pass all app args to integration test...

2018-06-29 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21672
  
I see why the old behavior was there. I made a minimal change to some 
existing code to fix a bug:


https://github.com/ssuchter/spark/commit/1d8a265d13b65dcec8db11a5be09d4a029037d2c

but this new way is better.

Question: In this new way, do we even need the test for 
appArguments.appArgs.length > 0? Could we just use appArguments.appArgs?
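The reason the length guard may be unnecessary: appending an empty argument list is already a no-op, so the args could be passed through unconditionally. A tiny sketch (the function and field names are illustrative, not the PR's actual code):

```scala
// Appending an empty appArgs leaves the base command unchanged, so a
// `length > 0` guard adds nothing. Names here are hypothetical.
def buildCommand(base: Seq[String], appArgs: Seq[String]): Seq[String] =
  base ++ appArgs
```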


---




[GitHub] spark issue #21672: [SPARK-24694][K8S] Pass all app args to integration test...

2018-06-29 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21672
  
So this changes behavior, I think. In the old behavior, if the args were 
['a', 'b'] then you'd get a single arg of ['a b'] passed through, and with this 
you'd get ['a', 'b'].

This new behavior seems better, I'm just trying a bit to remember why we 
had the old behavior.


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-20 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-15 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-15 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-15 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-12 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-12 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
another comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
Retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
Comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
Another comment


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark issue #21517: Testing k8s change - please ignore (13)

2018-06-11 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21517
  
retest this please


---




[GitHub] spark pull request #21517: Testing k8s change - please ignore (13)

2018-06-08 Thread ssuchter
GitHub user ssuchter opened a pull request:

https://github.com/apache/spark/pull/21517

Testing k8s change - please ignore (13)

Please ignore this change - testing k8s pull request building.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ssuchter/spark testing-ssuchter-prb-branch13

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/21517.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #21517


commit 35d9e2568ad963c7f93aba2cf6e34911933bf2f0
Author: Sean Suchter 
Date:   2018-06-09T00:10:09Z

Testing prb




---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-08 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r194150705
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---
@@ -0,0 +1,294 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest
+
+import java.io.File
+import java.nio.file.{Path, Paths}
+import java.util.UUID
+import java.util.regex.Pattern
+
+import scala.collection.JavaConverters._
+
+import com.google.common.io.PatternFilenameFilter
+import io.fabric8.kubernetes.api.model.{Container, Pod}
+import org.scalatest.{BeforeAndAfter, BeforeAndAfterAll}
+import org.scalatest.concurrent.{Eventually, PatienceConfiguration}
+import org.scalatest.time.{Minutes, Seconds, Span}
+
+import org.apache.spark.SparkFunSuite
+import 
org.apache.spark.deploy.k8s.integrationtest.backend.{IntegrationTestBackend, 
IntegrationTestBackendFactory}
+import org.apache.spark.deploy.k8s.integrationtest.config._
+
+private[spark] class KubernetesSuite extends SparkFunSuite
+  with BeforeAndAfterAll with BeforeAndAfter {
+
+  import KubernetesSuite._
+
+  private var testBackend: IntegrationTestBackend = _
+  private var sparkHomeDir: Path = _
+  private var kubernetesTestComponents: KubernetesTestComponents = _
+  private var sparkAppConf: SparkAppConf = _
+  private var image: String = _
+  private var containerLocalSparkDistroExamplesJar: String = _
+  private var appLocator: String = _
+  private var driverPodName: String = _
+
+  override def beforeAll(): Unit = {
+// The scalatest-maven-plugin gives system properties that are 
referenced but not set null
+// values. We need to remove the null-value properties before 
initializing the test backend.
+val nullValueProperties = System.getProperties.asScala
+  .filter(entry => entry._2.equals("null"))
+  .map(entry => entry._1.toString)
+nullValueProperties.foreach { key =>
+  System.clearProperty(key)
+}
+
+val sparkDirProp = 
System.getProperty("spark.kubernetes.test.unpackSparkDir")
+require(sparkDirProp != null, "Spark home directory must be provided 
in system properties.")
+sparkHomeDir = Paths.get(sparkDirProp)
+require(sparkHomeDir.toFile.isDirectory,
+  s"No directory found for spark home specified at $sparkHomeDir.")
+val imageTag = getTestImageTag
+val imageRepo = getTestImageRepo
+image = s"$imageRepo/spark:$imageTag"
+
+val sparkDistroExamplesJarFile: File = 
sparkHomeDir.resolve(Paths.get("examples", "jars"))
+  .toFile
+  .listFiles(new 
PatternFilenameFilter(Pattern.compile("^spark-examples_.*\\.jar$")))(0)
+containerLocalSparkDistroExamplesJar = 
s"local:///opt/spark/examples/jars/" +
+  s"${sparkDistroExamplesJarFile.getName}"
+testBackend = IntegrationTestBackendFactory.getTestBackend
+testBackend.initialize()
+kubernetesTestComponents = new 
KubernetesTestComponents(testBackend.getKubernetesClient)
+  }
+
+  override def afterAll(): Unit = {
+testBackend.cleanUp()
+  }
+
+  before {
+appLocator = UUID.randomUUID().toString.replaceAll("-", "")
+driverPodName = "spark-test-app-" + 
UUID.randomUUID().toString.replaceAll("-", "")
+sparkAppConf = kubernetesTestComponents.newSparkAppConf()
+  .set("spark.kubernetes.container.image", image)
+  .set("spark.kubernetes.driver.pod.name", driverPodName)
+  .set("spark.kubernetes.driver.label.spark-app-locator", appLocator)
+  .set("spark.kubernetes.executor.label.spark-app-locator", appLocator)
+   

[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-08 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r194150614
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/Utils.scala
 ---
@@ -0,0 +1,88 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest
+
+import java.io.Closeable
+import java.net.URI
+
+import org.apache.spark.internal.Logging
+
+object Utils extends Logging {
+
+  def tryWithResource[R <: Closeable, T](createResource: => R)(f: R => T): 
T = {
+val resource = createResource
+try f.apply(resource) finally resource.close()
+  }
+
+  def tryWithSafeFinally[T](block: => T)(finallyBlock: => Unit): T = {
--- End diff --

I agree. I removed it. I did a series of greps and removed things that 
weren't used.
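For contrast with the removed helper, the tryWithResource loan pattern that was kept is easy to exercise on its own; a self-contained usage sketch:

```scala
import java.io.{Closeable, StringReader}

// Same shape as the tryWithResource in the diff: the resource is closed
// whether or not the body throws.
def tryWithResource[R <: Closeable, T](createResource: => R)(f: R => T): T = {
  val resource = createResource
  try f(resource) finally resource.close()
}
```

For example, `tryWithResource(new StringReader("spark"))(r => r.read().toChar)` yields the first character and closes the reader afterwards.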


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-08 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r194150624
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/Utils.scala
 ---
@@ -0,0 +1,88 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest
+
+import java.io.Closeable
+import java.net.URI
+
+import org.apache.spark.internal.Logging
+
+object Utils extends Logging {
+
+  def tryWithResource[R <: Closeable, T](createResource: => R)(f: R => T): 
T = {
+val resource = createResource
+try f.apply(resource) finally resource.close()
+  }
+
+  def tryWithSafeFinally[T](block: => T)(finallyBlock: => Unit): T = {
+var originalThrowable: Throwable = null
+try {
+  block
+} catch {
+  case t: Throwable =>
+// Purposefully not using NonFatal, because even fatal exceptions
+// we don't want to have our finallyBlock suppress
+originalThrowable = t
+throw originalThrowable
+} finally {
+  try {
+finallyBlock
+  } catch {
+case t: Throwable =>
+  if (originalThrowable != null) {
+originalThrowable.addSuppressed(t)
+logWarning(s"Suppressing exception in finally: " + 
t.getMessage, t)
+throw originalThrowable
+  } else {
+throw t
+  }
+  }
+}
+  }
+
+  def checkAndGetK8sMasterUrl(rawMasterURL: String): String = {
--- End diff --

I agree. I removed it. I did a series of greps and removed things that 
weren't used.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-08 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r194133234
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---
@@ -0,0 +1,294 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest
+
+import java.io.File
+import java.nio.file.{Path, Paths}
+import java.util.UUID
+import java.util.regex.Pattern
+
+import scala.collection.JavaConverters._
+
+import com.google.common.io.PatternFilenameFilter
+import io.fabric8.kubernetes.api.model.{Container, Pod}
+import org.scalatest.{BeforeAndAfter, BeforeAndAfterAll}
+import org.scalatest.concurrent.{Eventually, PatienceConfiguration}
+import org.scalatest.time.{Minutes, Seconds, Span}
+
+import org.apache.spark.SparkFunSuite
+import 
org.apache.spark.deploy.k8s.integrationtest.backend.{IntegrationTestBackend, 
IntegrationTestBackendFactory}
+import org.apache.spark.deploy.k8s.integrationtest.config._
+
+private[spark] class KubernetesSuite extends SparkFunSuite
+  with BeforeAndAfterAll with BeforeAndAfter {
+
+  import KubernetesSuite._
+
+  private var testBackend: IntegrationTestBackend = _
+  private var sparkHomeDir: Path = _
+  private var kubernetesTestComponents: KubernetesTestComponents = _
+  private var sparkAppConf: SparkAppConf = _
+  private var image: String = _
+  private var containerLocalSparkDistroExamplesJar: String = _
+  private var appLocator: String = _
+  private var driverPodName: String = _
+
+  override def beforeAll(): Unit = {
+// The scalatest-maven-plugin gives system properties that are 
referenced but not set null
+// values. We need to remove the null-value properties before 
initializing the test backend.
+val nullValueProperties = System.getProperties.asScala
+  .filter(entry => entry._2.equals("null"))
+  .map(entry => entry._1.toString)
+nullValueProperties.foreach { key =>
+  System.clearProperty(key)
+}
+
+val sparkDirProp = 
System.getProperty("spark.kubernetes.test.unpackSparkDir")
+require(sparkDirProp != null, "Spark home directory must be provided 
in system properties.")
+sparkHomeDir = Paths.get(sparkDirProp)
+require(sparkHomeDir.toFile.isDirectory,
+  s"No directory found for spark home specified at $sparkHomeDir.")
+val imageTag = getTestImageTag
+val imageRepo = getTestImageRepo
+image = s"$imageRepo/spark:$imageTag"
+
+val sparkDistroExamplesJarFile: File = 
sparkHomeDir.resolve(Paths.get("examples", "jars"))
+  .toFile
+  .listFiles(new 
PatternFilenameFilter(Pattern.compile("^spark-examples_.*\\.jar$")))(0)
+containerLocalSparkDistroExamplesJar = 
s"local:///opt/spark/examples/jars/" +
+  s"${sparkDistroExamplesJarFile.getName}"
+testBackend = IntegrationTestBackendFactory.getTestBackend
+testBackend.initialize()
+kubernetesTestComponents = new 
KubernetesTestComponents(testBackend.getKubernetesClient)
+  }
+
+  override def afterAll(): Unit = {
+testBackend.cleanUp()
+  }
+
+  before {
+appLocator = UUID.randomUUID().toString.replaceAll("-", "")
+driverPodName = "spark-test-app-" + 
UUID.randomUUID().toString.replaceAll("-", "")
+sparkAppConf = kubernetesTestComponents.newSparkAppConf()
+  .set("spark.kubernetes.container.image", image)
+  .set("spark.kubernetes.driver.pod.name", driverPodName)
+  .set("spark.kubernetes.driver.label.spark-app-locator", appLocator)
+  .set("spark.kubernetes.executor.label.spark-app-locator", appLocator)
+   

[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-07 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r193957378
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---
@@ -0,0 +1,231 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest
+
+import java.io.File
+import java.nio.file.{Path, Paths}
+import java.util.UUID
+import java.util.regex.Pattern
+
+import scala.collection.JavaConverters._
+
+import com.google.common.io.PatternFilenameFilter
+import io.fabric8.kubernetes.api.model.{Container, Pod}
+import org.scalatest.{BeforeAndAfter, BeforeAndAfterAll}
+import org.scalatest.concurrent.{Eventually, PatienceConfiguration}
+import org.scalatest.time.{Minutes, Seconds, Span}
+
+import org.apache.spark.SparkFunSuite
+import 
org.apache.spark.deploy.k8s.integrationtest.backend.{IntegrationTestBackend, 
IntegrationTestBackendFactory}
+import org.apache.spark.deploy.k8s.integrationtest.config._
+
+private[spark] class KubernetesSuite extends SparkFunSuite
+  with BeforeAndAfterAll with BeforeAndAfter {
+
+  import KubernetesSuite._
+
+  private var testBackend: IntegrationTestBackend = _
+  private var sparkHomeDir: Path = _
+  private var kubernetesTestComponents: KubernetesTestComponents = _
+  private var sparkAppConf: SparkAppConf = _
+  private var image: String = _
+  private var containerLocalSparkDistroExamplesJar: String = _
+  private var appLocator: String = _
+  private var driverPodName: String = _
+
+  override def beforeAll(): Unit = {
+// The scalatest-maven-plugin gives system properties that are 
referenced but not set null
+// values. We need to remove the null-value properties before 
initializing the test backend.
+val nullValueProperties = System.getProperties.asScala
+  .filter(entry => entry._2.equals("null"))
+  .map(entry => entry._1.toString)
+nullValueProperties.foreach { key =>
+  System.clearProperty(key)
+}
+
+val sparkDirProp = 
System.getProperty("spark.kubernetes.test.unpackSparkDir")
+require(sparkDirProp != null, "Spark home directory must be provided 
in system properties.")
+sparkHomeDir = Paths.get(sparkDirProp)
+require(sparkHomeDir.toFile.isDirectory,
+  s"No directory found for spark home specified at $sparkHomeDir.")
+val imageTag = getTestImageTag
+val imageRepo = getTestImageRepo
+image = s"$imageRepo/spark:$imageTag"
+
+val sparkDistroExamplesJarFile: File = 
sparkHomeDir.resolve(Paths.get("examples", "jars"))
+  .toFile
+  .listFiles(new 
PatternFilenameFilter(Pattern.compile("^spark-examples_.*\\.jar$")))(0)
+containerLocalSparkDistroExamplesJar = 
s"local:///opt/spark/examples/jars/" +
+  s"${sparkDistroExamplesJarFile.getName}"
+testBackend = IntegrationTestBackendFactory.getTestBackend
+testBackend.initialize()
+kubernetesTestComponents = new 
KubernetesTestComponents(testBackend.getKubernetesClient)
+  }
+
+  override def afterAll(): Unit = {
+testBackend.cleanUp()
+  }
+
+  before {
+appLocator = UUID.randomUUID().toString.replaceAll("-", "")
+driverPodName = "spark-test-app-" + 
UUID.randomUUID().toString.replaceAll("-", "")
+sparkAppConf = kubernetesTestComponents.newSparkAppConf()
+  .set("spark.kubernetes.container.image", image)
+  .set("spark.kubernetes.driver.pod.name", driverPodName)
+  .set("spark.kubernetes.driver.label.spark-app-locator", appLocator)
+  .set("spark.kubernetes.executor.label.spark-app-locator", appLocator)
+   

[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-06-07 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Hi folks - I don't think there is any more work to do on this PR. Is there 
something you're waiting for before merging it?


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-06-01 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
@mccheah If you want to merge now and have me fix the commented-out test in 
another PR, that's ok too.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-01 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r192488928
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---

[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-06-01 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
@skonto I'll test that and discuss with @shaneknapp. It wouldn't involve 
directly changing code in this PR, since the minikube creation/destruction is 
done by Jenkins job config, but it is relevant to how we set up the systems. We 
could potentially accelerate adoption across all the nodes. I'm not sure 
whether Docker is ready on all the nodes, but that's a good discussion to have.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-01 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r192466840
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---

[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-01 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r192452101
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---

[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-06-01 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r192447027
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---

[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-31 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Correct. I will change the K8s integration tests to do exactly what you 
describe above, after this is merged. @vanzin do you want to comment? One of 
the two of you will need to do the merge, of course.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-30 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191966223
  
--- Diff: 
resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
 ---
@@ -0,0 +1,91 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+TEST_ROOT_DIR=$(git rev-parse --show-toplevel)
+UNPACKED_SPARK_TGZ="$TEST_ROOT_DIR/target/spark-dist-unpacked"
+IMAGE_TAG_OUTPUT_FILE="$TEST_ROOT_DIR/target/image-tag.txt"
+DEPLOY_MODE="minikube"
+IMAGE_REPO="docker.io/kubespark"
+IMAGE_TAG="N/A"
+SPARK_TGZ="N/A"
+
+# Parse arguments
+while (( "$#" )); do
+  case $1 in
+--unpacked-spark-tgz)
+  UNPACKED_SPARK_TGZ="$2"
+  shift
+  ;;
+--image-repo)
+  IMAGE_REPO="$2"
+  shift
+  ;;
+--image-tag)
+  IMAGE_TAG="$2"
+  shift
+  ;;
+--image-tag-output-file)
+  IMAGE_TAG_OUTPUT_FILE="$2"
+  shift
+  ;;
+--deploy-mode)
+  DEPLOY_MODE="$2"
+  shift
+  ;;
+--spark-tgz)
+  SPARK_TGZ="$2"
+  shift
+  ;;
+*)
+  break
+  ;;
+  esac
+  shift
+done
+
+if [[ $SPARK_TGZ == "N/A" ]];
+then
+  echo "Must specify a Spark tarball to build Docker images against with 
--spark-tgz." && exit 1;
--- End diff --

Ok, it's important for me to be clear here. There are currently two PRBs. 
This will continue in the immediate future.

1. General Spark PRB, mainly for unit tests. This can run on all hosts.

2. K8s integration-specific PRB. This exits early on PRs that don't appear 
relevant. It is specifically for running K8s integration tests, and can 
only run on some hosts.

Because of the host restriction issue, these are two separate PRBs.

It is definitely true that each of these will build the main Spark jars 
separately, so that 11-minute build time will be spent twice. Since the 
K8s-integration PRB only does this on a small set of PRs, it's not a 
significant cost to the Jenkins infrastructure.

Within the K8s-integration PRB, the entire maven reactor is only built 
once, during the make distribution step. The integration test step doesn't 
rebuild it.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-30 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191962980
  
--- Diff: 
resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
 ---
--- End diff --

I timed the build on my laptop. Building the Spark jars took just over 11 
minutes; building the .tgz took about 7 seconds. So this extra step adds ~1% 
overhead.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-30 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191961749
  
--- Diff: 
resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
 ---
--- End diff --

Also - Jenkins has *already* been doing that - building the distribution 
.tgz for each Kubernetes-related PRB invocation. Note that the filtering of 
what counts as a Kubernetes-related change happens *before* the 
distribution .tgz is built.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-30 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191961317
  
--- Diff: 
resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
 ---
--- End diff --

The act of building the .tgz is much cheaper (faster) than the Java build 
or the integration tests themselves. I wouldn't worry about that.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-30 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Any more issues that should be addressed now? Is this ready for merge?


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-30 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191855713
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/IntegrationTestBackend.scala
 ---
@@ -0,0 +1,43 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.deploy.k8s.integrationtest.backend
+
+import io.fabric8.kubernetes.client.DefaultKubernetesClient
+
+import org.apache.spark.deploy.k8s.integrationtest.backend.minikube.MinikubeTestBackend
+
+private[spark] trait IntegrationTestBackend {
+  def initialize(): Unit
+  def getKubernetesClient: DefaultKubernetesClient
+  def cleanUp(): Unit = {}
+}
+
+private[spark] object IntegrationTestBackendFactory {
+  val DeployModeConfigKey = "spark.kubernetes.test.deployMode"
--- End diff --

Done


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-29 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191629581
  
--- Diff: 
resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh
 ---
@@ -0,0 +1,91 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+TEST_ROOT_DIR=$(git rev-parse --show-toplevel)
+UNPACKED_SPARK_TGZ="$TEST_ROOT_DIR/target/spark-dist-unpacked"
+IMAGE_TAG_OUTPUT_FILE="$TEST_ROOT_DIR/target/image-tag.txt"
+DEPLOY_MODE="minikube"
+IMAGE_REPO="docker.io/kubespark"
+IMAGE_TAG="N/A"
+SPARK_TGZ="N/A"
+
+# Parse arguments
+while (( "$#" )); do
+  case $1 in
+--unpacked-spark-tgz)
+  UNPACKED_SPARK_TGZ="$2"
+  shift
+  ;;
+--image-repo)
+  IMAGE_REPO="$2"
+  shift
+  ;;
+--image-tag)
+  IMAGE_TAG="$2"
+  shift
+  ;;
+--image-tag-output-file)
+  IMAGE_TAG_OUTPUT_FILE="$2"
+  shift
+  ;;
+--deploy-mode)
+  DEPLOY_MODE="$2"
+  shift
+  ;;
+--spark-tgz)
+  SPARK_TGZ="$2"
+  shift
+  ;;
+*)
+  break
+  ;;
+  esac
+  shift
+done
+
+if [[ $SPARK_TGZ == "N/A" ]];
+then
+  echo "Must specify a Spark tarball to build Docker images against with 
--spark-tgz." && exit 1;
--- End diff --

I'm sure it's possible, but I'd rather not attempt that refactor here. I'd 
be really happy if you wanted to.
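For reference, the option-parsing pattern quoted in the diff above can be exercised in isolation. This is a minimal sketch, assuming bash; the `parse_args` wrapper is hypothetical (the real script parses `"$@"` at top level), but the `case`/`shift` mechanics and the required-argument check are the same:

```shell
#!/usr/bin/env bash
# Sketch of the setup-integration-test-env.sh parsing pattern.
# parse_args is a hypothetical wrapper, not part of the PR script.
SPARK_TGZ="N/A"
DEPLOY_MODE="minikube"

parse_args() {
  while (( "$#" )); do
    case $1 in
      --spark-tgz)
        SPARK_TGZ="$2"
        shift        # consume the option's value
        ;;
      --deploy-mode)
        DEPLOY_MODE="$2"
        shift
        ;;
      *)
        break        # stop at the first unrecognized argument
        ;;
    esac
    shift            # consume the option itself
  done
}

parse_args --spark-tgz /tmp/spark-dist.tgz
if [[ $SPARK_TGZ == "N/A" ]]; then
  echo "Must specify a Spark tarball with --spark-tgz." && exit 1
fi
echo "$SPARK_TGZ $DEPLOY_MODE"
```

Each option consumes two positional parameters (the flag and its value), which is why every branch does one `shift` inside the `case` and the loop does a second one after it.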


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-29 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Ok, I think all open issues have been resolved. The PRB failures are 
because of github request failures, so they are spurious. @vanzin @mccheah I 
think it's ready for another look.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-29 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191487129
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesTestComponents.scala
 ---
@@ -0,0 +1,116 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest
+
+import java.nio.file.{Path, Paths}
+import java.util.UUID
+
+import scala.collection.JavaConverters._
+import scala.collection.mutable
+
+import io.fabric8.kubernetes.client.DefaultKubernetesClient
+import org.scalatest.concurrent.Eventually
+
+private[spark] class KubernetesTestComponents(defaultClient: DefaultKubernetesClient) {
+
+  val namespaceOption = Option(System.getProperty("spark.kubernetes.test.namespace"))
+  val hasUserSpecifiedNamespace = namespaceOption.isDefined
+  val namespace = namespaceOption.getOrElse(UUID.randomUUID().toString.replaceAll("-", ""))
+  private val serviceAccountName =
+Option(System.getProperty("spark.kubernetes.test.serviceAccountName"))
+  .getOrElse("default")
+  val kubernetesClient = defaultClient.inNamespace(namespace)
+  val clientConfig = kubernetesClient.getConfiguration
+
+  def createNamespace(): Unit = {
+defaultClient.namespaces.createNew()
+  .withNewMetadata()
+  .withName(namespace)
+  .endMetadata()
+  .done()
+  }
+
+  def deleteNamespace(): Unit = {
+defaultClient.namespaces.withName(namespace).delete()
+Eventually.eventually(KubernetesSuite.TIMEOUT, KubernetesSuite.INTERVAL) {
+  val namespaceList = defaultClient
+.namespaces()
+.list()
+.getItems
+.asScala
+  require(!namespaceList.exists(_.getMetadata.getName == namespace))
+}
+  }
+
+  def newSparkAppConf(): SparkAppConf = {
+new SparkAppConf()
+  .set("spark.master", s"k8s://${kubernetesClient.getMasterUrl}")
+  .set("spark.kubernetes.namespace", namespace)
+  .set("spark.executor.memory", "500m")
+  .set("spark.executor.cores", "1")
+  .set("spark.executors.instances", "1")
+  .set("spark.app.name", "spark-test-app")
+  .set("spark.ui.enabled", "true")
+  .set("spark.testing", "false")
+  .set("spark.kubernetes.submission.waitAppCompletion", "false")
+  .set("spark.kubernetes.authenticate.driver.serviceAccountName", 
serviceAccountName)
+  }
+}
+
+private[spark] class SparkAppConf {
+
+  private val map = mutable.Map[String, String]()
+
+  def set(key: String, value: String): SparkAppConf = {
+map.put(key, value)
+this
+  }
+
+  def get(key: String): String = map.getOrElse(key, "")
+
+  def setJars(jars: Seq[String]): Unit = set("spark.jars", jars.mkString(","))
+
+  override def toString: String = map.toString
+
+  def toStringArray: Iterable[String] = map.toList.flatMap(t => List("--conf", s"${t._1}=${t._2}"))
+}
+
+private[spark] case class SparkAppArguments(
+mainAppResource: String,
+mainClass: String,
+appArgs: Array[String])
+
+private[spark] object SparkAppLauncher extends Logging {
--- End diff --

This one is so much smaller (< 10 lines of executable code) than 
SparkLauncher that I think we should not try to switch in this CL.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-29 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191486261
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/resources/log4j.properties
 ---
@@ -0,0 +1,31 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# Set everything to be logged to the file target/integration-tests.log
+log4j.rootCategory=INFO, file
--- End diff --

It's still necessary, it does not inherit.
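Concretely: with log4j 1.x, a module's test logging config does not inherit from the parent Maven module, so the integration-tests module has to ship its own `log4j.properties` defining the `file` appender that `log4j.rootCategory=INFO, file` refers to. A sketch of that wiring, assuming the standard log4j 1.x appender keys (the exact layout pattern is illustrative, not copied from the PR file):

```properties
# Route everything at INFO and above to target/integration-tests.log
log4j.rootCategory=INFO, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.append=true
log4j.appender.file.file=target/integration-tests.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss.SSS} %p %c{1}: %m%n
```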


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-29 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191485369
  
--- Diff: resource-managers/kubernetes/integration-tests/pom.xml ---
@@ -0,0 +1,230 @@
+
+
+http://maven.apache.org/POM/4.0.0; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd;>
+  4.0.0
+  
+org.apache.spark
+spark-parent_2.11
+2.4.0-SNAPSHOT
+../../../pom.xml
+  
+
+  spark-kubernetes-integration-tests_2.11
+  spark-kubernetes-integration-tests
+  
+3.3.9
+3.5
+1.1.1
+5.0.2
+1.3.0
+1.4.0
+
+18.0
+1.3.9
+3.0.0
+1.2.17
+2.11.8
+2.11
+3.2.2
+2.2.6
+1.0
+1.7.24
+kubernetes-integration-tests
+
${project.build.directory}/spark-dist-unpacked
+N/A
+
${project.build.directory}/imageTag.txt
+
minikube
+
docker.io/kubespark
+
+  
+  jar
+  Spark Project Kubernetes Integration Tests
+
+  
+
+  org.apache.spark
+  spark-core_${scala.binary.version}
+  ${project.version}
+
+
+
+  org.apache.spark
+  spark-core_${scala.binary.version}
+  ${project.version}
+  test-jar
+  test
+
+
+
+  commons-logging
+  commons-logging
+  ${commons-logging.version}
+
+
+  com.google.code.findbugs
+  jsr305
+  ${jsr305.version}
+
+
+  com.google.guava
+  guava
+  test
+  
+  ${guava.version}
+
+
+  com.spotify
+  docker-client
+  ${docker-client.version}
+  test
+
+
+  io.fabric8
+  kubernetes-client
+  ${kubernetes-client.version}
+
+
+  log4j
+  log4j
+  ${log4j.version}
+
+
+  org.apache.commons
+  commons-lang3
+  ${commons-lang3.version}
+
+
+  org.scala-lang
+  scala-library
+  ${scala.version}
+
+
+  org.scalatest
+  scalatest_${scala.binary.version}
+  test
+
+
+  org.slf4j
+  slf4j-log4j12
+  ${slf4j-log4j12.version}
+  test
+
+  
+
+  
+
+  
+net.alchim31.maven
--- End diff --

Resolved in 5a8fd7ff40


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-29 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191483976
  
--- Diff: resource-managers/kubernetes/integration-tests/pom.xml ---
@@ -0,0 +1,230 @@
+
+
+http://maven.apache.org/POM/4.0.0; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd;>
+  4.0.0
+  
+org.apache.spark
+spark-parent_2.11
+2.4.0-SNAPSHOT
+../../../pom.xml
+  
+
+  spark-kubernetes-integration-tests_2.11
+  spark-kubernetes-integration-tests
+  
+3.3.9
+3.5
+1.1.1
+5.0.2
+1.3.0
+1.4.0
+
+18.0
+1.3.9
+3.0.0
+1.2.17
+2.11.8
+2.11
+3.2.2
+2.2.6
+1.0
+1.7.24
+kubernetes-integration-tests
+
${project.build.directory}/spark-dist-unpacked
+N/A
+
${project.build.directory}/imageTag.txt
+
minikube
+
docker.io/kubespark
+
+  
+  jar
+  Spark Project Kubernetes Integration Tests
+
+  
+
+  org.apache.spark
+  spark-core_${scala.binary.version}
+  ${project.version}
+
+
+
+  org.apache.spark
+  spark-core_${scala.binary.version}
+  ${project.version}
+  test-jar
+  test
+
+
+
+  commons-logging
+  commons-logging
+  ${commons-logging.version}
+
+
+  com.google.code.findbugs
+  jsr305
+  ${jsr305.version}
+
+
+  com.google.guava
+  guava
+  test
+  
+  ${guava.version}
+
+
+  com.spotify
+  docker-client
+  ${docker-client.version}
+  test
+
+
+  io.fabric8
+  kubernetes-client
+  ${kubernetes-client.version}
+
+
+  log4j
+  log4j
+  ${log4j.version}
--- End diff --

Resolved in 901edb3ba3, f68cdba77b, dd280327ea


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-29 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191483921
  
--- Diff: resource-managers/kubernetes/integration-tests/pom.xml ---
@@ -0,0 +1,230 @@
+
+
+http://maven.apache.org/POM/4.0.0; 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance; 
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd;>
+  4.0.0
+  
+org.apache.spark
+spark-parent_2.11
+2.4.0-SNAPSHOT
+../../../pom.xml
+  
+
+  spark-kubernetes-integration-tests_2.11
+  spark-kubernetes-integration-tests
+  
+3.3.9
+3.5
--- End diff --

Resolved in 901edb3ba3, f68cdba77b, dd280327ea


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-29 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191474516
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/Logging.scala
 ---
@@ -0,0 +1,35 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest
+
+import org.apache.log4j.{Logger, LogManager, Priority}
+
+trait Logging {
--- End diff --

Resolved in cfb8ee94e11b4871f9b8c7db4774bdb6cb42c40e


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-28 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
I'll work on Matt's comments from Friday next.

Here's the output (after the bugfix) from running against mainline:

```
MBP:~/src/ssuchter-spark% git remote get-url origin
https://github.com/ssuchter/spark.git

MBP:~/src/ssuchter-spark% git remote get-url upstream
git://github.com/apache/spark.git

1d8a265d13 (HEAD -> ssuchter-k8s-integration-tests, 
origin/ssuchter-k8s-integration-tests) Fix a bug in KubernetesTestComponents - 
don't add an empty string for zero arguments
65347b319a Merge branch 'ssuchter-k8s-integration-tests' of 
https://github.com/ssuchter/spark into ssuchter-k8s-integration-tests
1a531abcf6 Remove unused code relating to Kerberos, which doesn't belong in 
this PR
3ba6ffb5f2 Remove e2e-prow.sh, which isn't appropriate for this PR
9e64f43b62 Remove unnecessary cloning and building code for the Spark repo
e6bd56325d Update README.md excluding cloning and building logic
e70f3bea3d Remove K8s cloud-based backend testing support from this PR
a0023b2f33 Remove config options that were only used during repo clone 
process
e55b8a723e Remove repository cloning behavior and allow script to be called 
from other directories
9b0eede244 Fixes for scala style
f29679ef56 Ignore dist/ for style checks
3615953bea Fix scala style issues
bef586f740 Remove LICENSE and copy of mvn wrapper script. Rewrite path for 
calling mvn wrapper script.
81c7a66ad6 Make k8s integration tests build when top-level kubernetes 
profile selected
365d6bc65d Initial checkin of k8s integration tests. These tests were 
developed in the https://github.com/apache-spark-on-k8s/spark-integration repo 
by several contributors. This is a copy of the current state into the main 
apache spark repo. The only changes from the current spark-integration repo 
state are: * Move the files from the repo root into 
resource-managers/kubernetes/integration-tests * Add a reference to these tests 
in the root README.md * Fix a path reference in 
dev/dev-run-integration-tests.sh * Add a TODO in include/util.sh
dbce275784 Remove unused code relating to Kerberos, which doesn't belong in 
this PR
5ffa464c65 Remove e2e-prow.sh, which isn't appropriate for this PR
13721f69a2 Remove unnecessary cloning and building code for the Spark repo
ba720733fa Update README.md excluding cloning and building logic
1b1528a504 (upstream/master, origin/master, origin/HEAD, master) 
[SPARK-24366][SQL] Improving of error messages for type converting

MBP:~/src/ssuchter-spark% echo $REVISION
1d8a265d13
MBP:~/src/ssuchter-spark% echo $DATE
20180528

MBP:~/src/ssuchter-spark% ./dev/make-distribution.sh --name 
${DATE}-${REVISION} --tgz -DzincPort=${ZINC_PORT} -Phadoop-2.7 -Pkubernetes 
-Pkinesis-asl -Phive -Phive-thriftserver
+++ dirname ./dev/make-distribution.sh
++ cd ./dev/..
++ pwd
+ SPARK_HOME=/Users/ssuchter/src/ssuchter-spark
+ DISTDIR=/Users/ssuchter/src/ssuchter-spark/dist
+ MAKE_TGZ=false
+ MAKE_PIP=false
+ MAKE_R=false
…
+ TARDIR_NAME=spark-2.4.0-SNAPSHOT-bin-20180528-1d8a265d13
+ 
TARDIR=/Users/ssuchter/src/ssuchter-spark/spark-2.4.0-SNAPSHOT-bin-20180528-1d8a265d13
+ rm -rf 
/Users/ssuchter/src/ssuchter-spark/spark-2.4.0-SNAPSHOT-bin-20180528-1d8a265d13
+ cp -r /Users/ssuchter/src/ssuchter-spark/dist 
/Users/ssuchter/src/ssuchter-spark/spark-2.4.0-SNAPSHOT-bin-20180528-1d8a265d13
+ tar czf spark-2.4.0-SNAPSHOT-bin-20180528-1d8a265d13.tgz -C 
/Users/ssuchter/src/ssuchter-spark spark-2.4.0-SNAPSHOT-bin-20180528-1d8a265d13
+ rm -rf 
/Users/ssuchter/src/ssuchter-spark/spark-2.4.0-SNAPSHOT-bin-20180528-1d8a265d13

MBP:~/src/ssuchter-spark% 
./resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh
 --spark-tgz 
~/src/ssuchter-spark/spark-2.4.0-SNAPSHOT-bin-20180528-1d8a265d13.tgz
Using `mvn` from path: /usr/local/bin/mvn
[INFO] Scanning for projects...
[INFO]
[INFO] 

[INFO] Building Spark Project Kubernetes Integration Tests 2.4.0-SNAPSHOT
[INFO] 

[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-versions) @ 
spark-kubernetes-integration-tests_2.11 ---
…
Successfully tagged kubespark/spark:68388D3B-6FAC-4E59-8AED-8604AA437C2D

/Users/ssuchter/src/ssuchter-spark/resource-managers/kubernetes/integration-tests
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ 
spark-kubernetes-integration-tests_2.11 ---
Discovery starting.
Discovery completed in 118 milliseconds.
Run starting. Expected test count is: 1
KubernetesSuite:
- Run SparkPi with no resources
Run completed in

[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-28 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Fixed the bug. @mccheah I'd appreciate your eyes on commit 1d8a265, for 
both correctness and style. (I haven't used Scala before this project, so I'm 
not very confident in the best way to do things.)


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Hm, I'm glad I tried to paste output of running against master (well, 
strictly my rebase of this PR, but functionally the same thing). It's throwing 
some error that I need to debug:

  2018-05-26 00:46:32 INFO  KubernetesClusterSchedulerBackend:54 - 
SchedulerBackend is ready for scheduling beginning after reached 
minRegisteredResourcesRatio: 0.8
  Exception in thread "main" java.lang.NumberFormatException: For input 
string: ""
at 
java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:592)
at java.lang.Integer.parseInt(Integer.java:615)
at 
scala.collection.immutable.StringLike$class.toInt(StringLike.scala:272)
at scala.collection.immutable.StringOps.toInt(StringOps.scala:29)
at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:32)
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:840)
at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:194)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:915)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:926)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
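The root cause in the trace above is `Integer.parseInt` (reached via Scala's `String.toInt` in `SparkPi.main`) being handed an empty string, i.e. the launcher passed an empty argument to the app. A minimal sketch of the failure mode and a guard; the `safeParse` helper is illustrative only, not code from the PR (the actual fix was to stop emitting an empty argument string):

```java
public class EmptyArgDemo {
    // Fall back to a default when the argument is empty, mirroring the
    // invariant the harness needed: never hand "" to an int parser.
    static int safeParse(String s, int fallback) {
        if (s == null || s.isEmpty()) {
            return fallback;
        }
        return Integer.parseInt(s);
    }

    public static void main(String[] args) {
        try {
            Integer.parseInt("");  // exactly the failure in the stack trace
        } catch (NumberFormatException e) {
            System.out.println(e.getMessage()); // For input string: ""
        }
        System.out.println(safeParse("", 2));   // fallback, like SparkPi's default slice count
        System.out.println(safeParse("10", 2));
    }
}
```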


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Ok, I didn't do anything wrong. There is a bug in the GitHub web UI where 
the PR's web view doesn't get updated when I rebase. I had to change the branch 
GitHub thought I was merging into to 2.3 and then back to master, and now it 
correctly shows just my diffs.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Oops, I really did something wrong there. Please ignore while I fix.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
FYI - I just rebased to an updated master.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Thanks for the pointer about @ifilonenko's comment. I removed the 
Kerberos/secrets-related code. I'll post the output momentarily.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-25 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191031379
  
--- Diff: 
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
 ---
@@ -0,0 +1,391 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest
+
+import java.io.File
+import java.nio.file.{Path, Paths}
+import java.util.UUID
+import java.util.regex.Pattern
+
+import scala.collection.JavaConverters._
+import com.google.common.io.PatternFilenameFilter
+import io.fabric8.kubernetes.api.model.{Container, Pod}
+import org.scalatest.{BeforeAndAfter, BeforeAndAfterAll, FunSuite}
+import org.scalatest.concurrent.{Eventually, PatienceConfiguration}
+import org.scalatest.time.{Minutes, Seconds, Span}
+
+import org.apache.spark.deploy.k8s.integrationtest.backend.{IntegrationTestBackend, IntegrationTestBackendFactory}
+import org.apache.spark.deploy.k8s.integrationtest.config._
+
+private[spark] class KubernetesSuite extends FunSuite with BeforeAndAfterAll with BeforeAndAfter {
+
+  import KubernetesSuite._
+
+  private var testBackend: IntegrationTestBackend = _
+  private var sparkHomeDir: Path = _
+  private var kubernetesTestComponents: KubernetesTestComponents = _
+  private var sparkAppConf: SparkAppConf = _
+  private var image: String = _
+  private var containerLocalSparkDistroExamplesJar: String = _
+  private var appLocator: String = _
+  private var driverPodName: String = _
+
+  override def beforeAll(): Unit = {
+// The scalatest-maven-plugin gives system properties that are referenced but not set null
+// values. We need to remove the null-value properties before initializing the test backend.
+val nullValueProperties = System.getProperties.asScala
+  .filter(entry => entry._2.equals("null"))
+  .map(entry => entry._1.toString)
+nullValueProperties.foreach { key =>
+  System.clearProperty(key)
+}
+
+val sparkDirProp = System.getProperty("spark.kubernetes.test.unpackSparkDir")
+require(sparkDirProp != null, "Spark home directory must be provided in system properties.")
+sparkHomeDir = Paths.get(sparkDirProp)
+require(sparkHomeDir.toFile.isDirectory,
+  s"No directory found for spark home specified at $sparkHomeDir.")
+val imageTag = getTestImageTag
+val imageRepo = getTestImageRepo
+image = s"$imageRepo/spark:$imageTag"
+
+val sparkDistroExamplesJarFile: File = sparkHomeDir.resolve(Paths.get("examples", "jars"))
+  .toFile
+  .listFiles(new PatternFilenameFilter(Pattern.compile("^spark-examples_.*\\.jar$")))(0)
+containerLocalSparkDistroExamplesJar = s"local:///opt/spark/examples/jars/" +
+  s"${sparkDistroExamplesJarFile.getName}"
+testBackend = IntegrationTestBackendFactory.getTestBackend
+testBackend.initialize()
+kubernetesTestComponents = new KubernetesTestComponents(testBackend.getKubernetesClient)
+  }
+
+  override def afterAll(): Unit = {
+testBackend.cleanUp()
+  }
+
+  before {
+appLocator = UUID.randomUUID().toString.replaceAll("-", "")
+driverPodName = "spark-test-app-" + UUID.randomUUID().toString.replaceAll("-", "")
+sparkAppConf = kubernetesTestComponents.newSparkAppConf()
+  .set("spark.kubernetes.container.image", image)
+  .set("spark.kubernetes.driver.pod.name", driverPodName)
+  .set("spark.kubernetes.driver.label.spark-app-locator", appLocator)
+  .set("spark.kubernetes.executor.label.spark-app-locator", appLocator)
+   

[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
retest this please


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
I saw that @mccheah put a comment in the e2e-prow.sh that we don't need it 
here, so I took that as an answer to my previous question.

I think I got rid of the cloning and building Spark source code. I also 
updated README.md - PTAL.


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-25 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191028169
  
--- Diff: resource-managers/kubernetes/integration-tests/e2e/e2e-prow.sh ---
@@ -0,0 +1,69 @@
+#!/bin/bash
+
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+### This script is used by Kubernetes Test Infrastructure to run integration tests.
+### See documentation at https://github.com/kubernetes/test-infra/tree/master/prow
+
+set -ex
+
+# set cwd correctly
+cd "$(dirname "$0")/../"
+
+# Include requisite scripts
+source ./include/util.sh
+
+TEST_ROOT_DIR=$(git rev-parse --show-toplevel)
+BRANCH="master"
+SPARK_REPO="https://github.com/apache/spark"
+SPARK_REPO_LOCAL_DIR="$TEST_ROOT_DIR/target/spark"
+
+## Install basic dependencies
+## These are for the kubekins-e2e environment in 
https://github.com/kubernetes/test-infra/tree/master/images/kubekins-e2e
echo "deb http://http.debian.net/debian jessie-backports main" >> /etc/apt/sources.list
+apt-get update && apt-get install -y curl wget git tar uuid-runtime
--- End diff --

Removed, thanks


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-05-25 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r191028204
  
--- Diff: resource-managers/kubernetes/integration-tests/include/util.sh ---
@@ -0,0 +1,43 @@
+#!/usr/bin/env bash
+
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+clone_build_spark() {
--- End diff --

Removed, thanks


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
I will take a look at the docs too.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
You're right, the prow-based codepath still does the cloning. Sorry! I did 
remove the cloning and building option from the entry point 
resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh,
 so not all of it is still there.

@mccheah - Why do we need the prow-based codepath at all in this PR? AMPLab 
jenkins doesn't use it. It only uses 
resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
I fixed all the other issues (removal of cloud-based k8s backend, 
repository clone logic).

@erikerlandson @mccheah @vanzin PTAL.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
> If not all hosts currently can run them, then it's ok to postpone this.

That's exactly what I'm proposing too, so it seems like we agree. I also 
agree, btw, on the future state - when all hosts can run this, then 
incorporating into dev/run-tests.py is a good idea.

I'm going to proceed by fixing all other open issues, and ping when this 
can get another review, then.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Hm, that wasn't the answer I expected to hear. It leads me to another 
question:

If we don't have developers using this script directly, what's the value of 
trying to incorporate the running of the Kubernetes integration tests into this 
program? Because of the AMPLab setup, we're going to have to have special 
Jenkins jobs dedicated to the running of the K8s integration tests. This is 
because not all hosts can run these integration tests, so therefore we have 
special jobs that are tied to the specific hosts that have the right setup to 
run the minikube VMs. So it's just going to be those jobs that call 
dev/run-tests.py with the special non-default argument to run these tests. 
Those jenkins jobs could just as easily run a different script.

An alternative way to take my argument is that we should delay the 
integration into run-tests.py until after **all** of the AMPLab hosts can run 
these tests. Since that's not today, it wouldn't need to be in this initial PR, 
and could have its own PR.

Thoughts? Thanks, Sean


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-25 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
@vanzin @felixcheung I'm trying to look into this again. My main question 
is how people use this dev/run-tests.py suite. Every time I run it, even though 
I've only changed things in the kubernetes resourcemanager section, it takes 
hours to run, and isn't even running the kubernetes tests. Is there some 
command line flags or ways that people use this tool to just select the tests 
that they want to run? Please feel free to point me at a doc to read, I'm just 
not sure which one to read.
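
The kind of by-name selection being asked about can be sketched generically. This is purely illustrative: the module names below are hypothetical stand-ins, and whatever selection flags dev/run-tests.py actually offered at the time may differ.

```shell
# Illustrative only: filter a list of test modules down to the one of interest,
# rather than running the whole multi-hour suite.
modules="core sql kubernetes mesos yarn"
want="kubernetes"
for m in $modules; do
  if [ "$m" = "$want" ]; then
    echo "would run tests for: $m"
  fi
done
```

The point is only that a single module can be picked out by name; the real script's interface is what the question is trying to establish.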


---




[GitHub] spark issue #21317: [SPARK-24232][k8s] Add support for secret env vars

2018-05-17 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21317
  
Hm, I started and stopped minikube, but I don't know why that would fix 
this. I suspect that it was actually fixed by the state on the node changing in 
the last few hours. Please highlight this issue if it recurs.


---




[GitHub] spark issue #21317: [SPARK-24232][k8s] Add support for secret env vars

2018-05-17 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21317
  
retest this please


---




[GitHub] spark issue #21317: [SPARK-24232][k8s] Add support for secret env vars

2018-05-17 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/21317
  
Something seems odd on the minikube on AMPLab jenkins. Looking…

On May 17, 2018 at 10:57:32 AM, Anirudh Ramanathan 
(notificati...@github.com)
wrote:

@skonto <https://github.com/skonto>, looks like a test runner issue here.
@ssuchter <https://github.com/ssuchter>, can you PTAL?




---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-05-02 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
What about the issue of moving the invocation of the tests into 
run-tests.py? Erik - do you not believe that is a requirement?


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-03-07 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
Yeah, I just figured that out. (That it's because of the dist/ directory.) 
I'll change dev/tox.ini so I don't have to keep cleaning. Thanks!
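
A sketch of the kind of exclusion that could go into dev/tox.ini — the section and option follow pycodestyle's configuration format, but the surrounding contents of Spark's actual dev/tox.ini are assumed, not shown:

```ini
[pycodestyle]
# Skip generated/packaged trees such as dist/ so stale build output
# does not fail style checks on unrelated changes.
exclude = dist/*
```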


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-03-07 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
@vanzin I'm working on the run-tests.py integration. It seems to be failing 
on a bunch of comments on the file dist/python/docs/conf.py

./dist/python/docs/conf.py:258:1: E265 block comment should start with '# '
./dist/python/docs/conf.py:261:1: E265 block comment should start with '# '
./dist/python/docs/conf.py:264:1: E265 block comment should start with '# '

This change isn't involved with that file. Should I be fixing that file, or 
is there a way that people typically run run-tests.py to skip it? (If there's a 
place I should be reading about this, please feel free to point me there.)


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-03-06 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r172707045
  
--- Diff: resource-managers/kubernetes/integration-tests/include/util.sh ---
@@ -0,0 +1,43 @@
+#!/usr/bin/env bash
+
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+clone_build_spark() {
--- End diff --

Agree, this code is something that should be removed.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-03-06 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
I agree with your point about using a non-tgz version, but I'd prefer to do 
that as a separate change. (However, I reserve my right to change this stance 
if it turns out to be easier to make the other changes to this PR in a way that 
runs without the .tgz file)


---




[GitHub] spark pull request #20697: [SPARK-23010][k8s] Initial checkin of k8s integra...

2018-03-06 Thread ssuchter
Github user ssuchter commented on a diff in the pull request:

https://github.com/apache/spark/pull/20697#discussion_r172691439
  
--- Diff: resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/backend/cloud/CloudTestBackend.scala ---
@@ -0,0 +1,41 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.deploy.k8s.integrationtest.backend.cloud
+
+import io.fabric8.kubernetes.client.{ConfigBuilder, DefaultKubernetesClient}
+
+import org.apache.spark.deploy.k8s.integrationtest.Utils
+import org.apache.spark.deploy.k8s.integrationtest.backend.IntegrationTestBackend
+
+private[spark] object CloudTestBackend extends IntegrationTestBackend {
--- End diff --

Certainly seems possible; I agree it would aid review. There's a bunch of 
changes I need to make, I'll include this point.


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-03-01 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
It's hard to detect when there are interesting changes, because changes to 
spark core code can meaningfully affect the K8s code paths. Many, many changes 
might be considered "interesting." As such, the plan was to have a dedicated 
PRB for doing these tests that triggers on any build that mentions "kubernetes" 
or "k8s" in the PR description.
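
The keyword trigger described above amounts to a trivial text filter; the variable name and exact pattern below are assumptions for illustration, not the actual PRB configuration.

```shell
# Hypothetical stand-in for the PR description a Jenkins PRB job would inspect.
pr_description="[SPARK-23010][k8s] Initial checkin of k8s integration tests"

# Trigger the dedicated K8s integration-test job only when the text
# mentions "kubernetes" or "k8s" (case-insensitive).
if printf '%s\n' "$pr_description" | grep -qiE 'kubernetes|k8s'; then
  echo "trigger k8s integration tests"
else
  echo "no k8s keyword; skip"
fi
```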


---




[GitHub] spark issue #20697: [SPARK-23010][k8s] Initial checkin of k8s integration te...

2018-03-01 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
@vanzin I fixed the title (thanks @liyinan926) with the bug.

I'll work on integrating its build with the main build.

In terms of integration with the dev/run-tests.py, we at most want to make 
it optional. These tests have an additional system requirement (minikube) that 
I wouldn't want to break existing systems that use run-tests.py. What do you 
think?

With respect to the AMPLab builds - in previous discussing with 
@shaneknapp, the new Ubuntu-based systems are the only ones set up to run 
minikube, and jobs that exercise these tests are restricted to those systems. I 
think we should stick with that, because the direction (as I understand it) is 
to make all the systems follow that profile, and then we'll achieve the goal of 
having all systems able to run this integration test.
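
Since only some hosts have minikube, a job script can guard on its presence before attempting these tests. A minimal sketch — the messages and skip behavior are illustrative, not the actual Jenkins job configuration:

```shell
# Probe for minikube before running the K8s integration tests;
# hosts without it simply skip rather than fail the build.
if command -v minikube >/dev/null 2>&1; then
  echo "minikube available: running K8s integration tests"
else
  echo "minikube not found: skipping K8s integration tests"
fi
```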


---




[GitHub] spark issue #20697: Initial checkin of k8s integration tests.

2018-02-28 Thread ssuchter
Github user ssuchter commented on the issue:

https://github.com/apache/spark/pull/20697
  
I should note that the integration test success that came from SparkQA is 
**not** using the copy of the integration test code mentioned in this PR. It is 
using the original copy of the code in 
https://github.com/apache-spark-on-k8s/spark-integration. Once this is merged, 
I will create additional Jenkins jobs on the AMPLab infrastructure that 
exercises this new copy of the integration test. Once those are working, I will 
remove the Jenkins jobs that use the apache-spark-on-k8s/spark-integration repo.


---



