Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3893#discussion_r22512468
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -701,7 +701,7 @@ private[spark] object Utils extends Logging {
         }
       }
     
    -  private var customHostname: Option[String] = None
    +  private var customHostname: Option[String] = sys.env.get("SPARK_LOCAL_HOSTNAME")
    --- End diff ---
    
    There is already an environment variable called `SPARK_PUBLIC_DNS` in the docs.
    It is used to override the default host name in some cases (though, confusingly,
    only in a smaller subset of cases). I wonder if we should just fall back to
    `SPARK_PUBLIC_DNS` here and expand its scope slightly. We'd need to audit all of
    the places where it is used, but that might be preferable to introducing another
    override. Alternatively, we could keep two overrides here and explain that
    `SPARK_PUBLIC_DNS` takes precedence but is only honored in certain cases.
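    
    For illustration, a minimal sketch of what the two-override option might look
    like in `Utils.scala`, with `SPARK_PUBLIC_DNS` taking precedence over
    `SPARK_LOCAL_HOSTNAME`; this is only a sketch of the idea, not the change
    proposed in this PR:
    
    ```scala
    // Sketch only (not the change in this diff): SPARK_PUBLIC_DNS wins when set,
    // otherwise fall back to the SPARK_LOCAL_HOSTNAME override from this PR,
    // otherwise keep the existing default of None.
    private var customHostname: Option[String] =
      sys.env.get("SPARK_PUBLIC_DNS").orElse(sys.env.get("SPARK_LOCAL_HOSTNAME"))
    ```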

