Github user skonto commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21652#discussion_r200135218
  
    --- Diff: resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala ---
    @@ -265,6 +323,37 @@ private[spark] class KubernetesSuite extends SparkFunSuite
         assert(envVars("ENV2") === "VALUE2")
       }
     
    +  private def executeCommand(cmd: String*)(implicit podName: String): String = {
    +    val out = new ByteArrayOutputStream()
    +    val watch = kubernetesTestComponents
    +      .kubernetesClient
    +      .pods()
    +      .withName(podName)
    +      .readingInput(System.in)
    +      .writingOutput(out)
    +      .writingError(System.err)
    +      .withTTY()
    +      .exec(cmd.toArray: _*)
    +    // wait to get some result back
    +    Thread.sleep(1000)
    --- End diff ---
    
    @ssuchter I tried the approach here:
    https://github.com/fabric8io/kubernetes-client/blob/master/kubernetes-examples/src/main/java/io/fabric8/kubernetes/examples/ExecLoopExample.java
    and also the one mentioned here:
    https://github.com/fabric8io/kubernetes-client/blob/master/kubernetes-examples/src/main/java/io/fabric8/kubernetes/examples/ExecPipesExample.java#L60
    Latches were not useful to me, and the examples there use the timeout approach (I recall looking at them before writing the code here). The ideal approach would be to poll the watch output continuously on a future and wait until it completes, with some timeout, but that does not seem to work for me.
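    For illustration, here is a rough sketch of the future-based polling I had in mind, reusing the `out` ByteArrayOutputStream wired into `writingOutput` above (the `waitForOutput` helper name and the 30-second timeout are just placeholders, not part of the PR):
    
        import java.io.ByteArrayOutputStream
        import scala.concurrent.{Await, Future}
        import scala.concurrent.ExecutionContext.Implicits.global
        import scala.concurrent.duration._
    
        // Poll the exec output buffer on a future instead of a fixed Thread.sleep,
        // failing with a TimeoutException if nothing shows up in time.
        private def waitForOutput(
            out: ByteArrayOutputStream,
            timeout: FiniteDuration = 30.seconds): String = {
          val result = Future {
            // Spin until the exec'd command has produced some output.
            while (out.size() == 0) {
              Thread.sleep(100)
            }
            out.toString()
          }
          Await.result(result, timeout)
        }
    
    One caveat of a sketch like this: a command that legitimately produces no output would simply hit the timeout, so the fixed Thread.sleep(1000) stays in the PR for now.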

