Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r175873355
--- Diff: core/src/main/scala/org/apache/spark/ui/WebUI.scala ---
@@ -126,7 +126,11 @@ private[spark] abstract class WebUI(
   def bind(): Unit = {
     assert(serverInfo.isEmpty, s"Attempted to bind $className more than once!")
     try {
-      val host = Option(conf.getenv("SPARK_LOCAL_IP")).getOrElse("0.0.0.0")
+      val host = if (Utils.isClusterMode(conf)) {
--- End diff ---
You can set `SPARK_LOCAL_IP`.
If you really want to change this, you must not change the current default
behavior, which is to bind to all interfaces. If you manage to do that without
doing cluster-specific checks, then I'm ok with it. But none of your current
solutions really pass that bar.
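
To be concrete, a minimal sketch of what I mean (not a drop-in patch, and `resolveBindHost` is just a hypothetical helper name for illustration): keep the existing resolution order, where binding to all interfaces stays the default and `SPARK_LOCAL_IP` remains the opt-in way to narrow it, with no cluster-mode check involved.

```scala
import org.apache.spark.SparkConf

// Sketch only: resolve the WebUI bind address the way the current code does.
// Default is "0.0.0.0" (all interfaces); SPARK_LOCAL_IP, if set, overrides it.
def resolveBindHost(conf: SparkConf): String =
  Option(conf.getenv("SPARK_LOCAL_IP")).getOrElse("0.0.0.0")
```

That keeps behavior identical for existing deployments while still giving users a way to avoid wildcard binding.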