Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20327#discussion_r176585546
  
    --- Diff: resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala ---
    @@ -136,6 +135,39 @@ class YarnClusterSuite extends BaseYarnClusterSuite {
         checkResult(finalState, result)
       }
     
    +  private def testClusterDriverBind(
    +      uiEnabled: Boolean,
    +      localHost: String,
    +      localIp: String,
    +      success: Boolean): Unit = {
    +    val result = File.createTempFile("result", null, tempDir)
    +    val finalState = runSpark(false, mainClassName(YarnClusterDriver.getClass),
    +      appArgs = Seq(result.getAbsolutePath()),
    +      extraConf = Map(
    +        "spark.yarn.appMasterEnv.SPARK_LOCAL_HOSTNAME" -> localHost,
    +        "spark.yarn.appMasterEnv.SPARK_LOCAL_IP" -> localIp,
    +        "spark.ui.enabled" -> uiEnabled.toString
    +      ))
    +    if (success) {
    +      checkResult(finalState, result, "success")
    +    } else {
    +      finalState should be (SparkAppHandle.State.FAILED)
    +    }
    +  }
    +
    +  test("yarn-cluster driver should be able to bind listeners to MM_HOST") {
    --- End diff ---
    
    > the regression of not being able to bind webUI to a specific interface is fixed
    
    This can be a unit test for `JettyUtils` if you really care about that.
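    
    For reference, a rough sketch of what such a unit-level check could look like. This is not code from this PR; it uses raw Jetty rather than `JettyUtils`' internal API, and the bind address is just a placeholder:
    
    ```scala
    import java.net.InetSocketAddress
    import org.eclipse.jetty.server.Server
    
    // Sketch only: assert that a Jetty server can be bound to an explicitly
    // chosen interface, which is roughly the property a JettyUtils-level
    // unit test would need to verify.
    val bindHost = "127.0.0.1"  // placeholder interface for the sketch
    val server = new Server(new InetSocketAddress(bindHost, 0))  // port 0 = ephemeral
    server.start()
    try {
      // The reported URI should carry the host we asked to bind to.
      assert(server.getURI.getHost == bindHost)
    } finally {
      server.stop()
    }
    ```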
    
    > they demonstrate how to bind RPC and webUI to different interfaces
    
    This can be added to the docs. Nobody is going to look at unit test code to figure out how to do that.
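    
    Something along these lines, for illustration only (app jar, main class, and host names are placeholders, not from this PR), mirroring the env vars the test above sets:
    
    ```scala
    import org.apache.spark.launcher.SparkLauncher
    
    // Sketch: in yarn-cluster mode the AM-side addresses can be steered
    // through spark.yarn.appMasterEnv.*, the same knobs the quoted test uses.
    val handle = new SparkLauncher()
      .setMaster("yarn")
      .setDeployMode("cluster")
      .setAppResource("/path/to/app.jar")   // placeholder
      .setMainClass("com.example.MyApp")    // placeholder
      .setConf("spark.yarn.appMasterEnv.SPARK_LOCAL_HOSTNAME", "rpc-iface.example.com")
      .setConf("spark.yarn.appMasterEnv.SPARK_LOCAL_IP", "10.0.0.5")
      .startApplication()
    ```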
    
    > It's not wrong. it's one of the reasonable choices. 
    
    It actually is, because it assumes that both the RM and the Spark app have the same configuration w.r.t. which interfaces they're binding to.
    
    But as you say, it's better to use `$NM_HOST` to find the NM's address instead of the current code.
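    
    i.e. something like this inside the AM (sketch only; the fallback is an assumption):
    
    ```scala
    // YARN exposes the NodeManager's host to its containers as NM_HOST, so
    // the AM can read it directly rather than deriving it from the RM's
    // configuration.
    val nmHost = sys.env.getOrElse("NM_HOST", "localhost")  // fallback is an assumption
    ```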
    
    > We need the fix in JettyUtils
    
    There's a separate PR with that fix too (the one I referenced above was closed and the correct one was opened as #20883).


