dongjoon-hyun commented on code in PR #52923:
URL: https://github.com/apache/spark/pull/52923#discussion_r2501567180
##########
core/src/main/scala/org/apache/spark/SparkEnv.scala:
##########
@@ -369,6 +371,11 @@ object SparkEnv extends Logging {
logInfo(log"Registering ${MDC(LogKeys.ENDPOINT_NAME, name)}")
rpcEnv.setupEndpoint(name, endpointCreator)
} else {
+ val useDriverPodIP =
+   conf.get("spark.kubernetes.executor.useDriverPodIP", "false").equalsIgnoreCase("true")
+ if (useDriverPodIP) {
+   conf.set(config.DRIVER_HOST_ADDRESS.key, conf.get(config.DRIVER_BIND_ADDRESS.key))
Review Comment:
> Additionally, given spark.kubernetes.executor.useDriverPodIP is intended to only affect K8s mode, it's better to have a condition check before overwriting.
For the condition check, this is already guarded by the `useDriverPodIP` condition check at line 376, @pan3793.
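
Just to make sure we are talking about the same thing, an explicit K8s-mode check on top of the flag would look roughly like the sketch below. This is illustrative only, not code from this PR; the master-URL check and the `isK8s` name are my own addition:

```scala
// Sketch: guard the overwrite with both the new flag and an explicit K8s-mode check,
// so it can never fire in other deploy modes.
val isK8s = conf.get("spark.master", "").startsWith("k8s://")
val useDriverPodIP =
  conf.get("spark.kubernetes.executor.useDriverPodIP", "false").equalsIgnoreCase("true")
if (isK8s && useDriverPodIP) {
  conf.set(config.DRIVER_HOST_ADDRESS.key, conf.get(config.DRIVER_BIND_ADDRESS.key))
}
```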
> I read [SPARK-4563](https://issues.apache.org/jira/browse/SPARK-4563), and I believe this is a central location to set up the driver advertised address, so we don't need to overwrite it in other places one by one.
Got it. For the above concern, let me look into it more, @pan3793. BTW, it would be great if you could share your specific use cases (if something is broken on your side 😄).
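
For my own understanding, the kind of driver-side setup I assume this targets looks roughly like the sketch below. It is a hypothetical example; the addresses and service name are made up, and only `spark.kubernetes.executor.useDriverPodIP` comes from this PR:

```scala
import org.apache.spark.SparkConf

// Hypothetical driver configuration in a K8s cluster (made-up values).
val conf = new SparkConf()
  .set("spark.driver.bindAddress", "10.244.1.23")            // driver pod IP
  .set("spark.driver.host", "spark-driver-svc.default.svc")  // headless-service DNS name
  .set("spark.kubernetes.executor.useDriverPodIP", "true")   // proposed flag from this PR

// With the change in SparkEnv applied, spark.driver.host would be overwritten with the
// bind address, so executors reach the driver at the pod IP instead of the DNS name.
```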