This is an automated email from the ASF dual-hosted git repository.
chengpan pushed a commit to branch branch-1.7
in repository https://gitbox.apache.org/repos/asf/kyuubi.git
The following commit(s) were added to refs/heads/branch-1.7 by this push:
new aa7a02682 [KYUUBI #5148] Improve spark.driver.host assignment in Spark on K8s client mode
aa7a02682 is described below
commit aa7a026823c6f5d3ec3694e17f93eaf89693f3ec
Author: marcoluo <[email protected]>
AuthorDate: Thu Aug 10 00:49:53 2023 +0800
[KYUUBI #5148] Improve spark.driver.host assignment in Spark on K8s client mode
### _Why are the changes needed?_
Before this PR, the IP address resolved in the K8s environment was incorrect, and `spark.driver.host` could not be specified manually.
This PR adjusts how the IP address is resolved and allows `spark.driver.host` to be supplied externally.
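A minimal sketch of the intended behavior, using Spark's `SparkConf` API with example host values: `setIfMissing` only assigns a value when the key is absent, so an externally supplied `spark.driver.host` survives engine bootstrap, whereas `set` would overwrite it.

```scala
import org.apache.spark.SparkConf

// setIfMissing respects a user-provided value ...
val conf = new SparkConf(loadDefaults = false)
conf.set("spark.driver.host", "10.0.0.42")           // externally supplied value (example)
conf.setIfMissing("spark.driver.host", "127.0.0.1")  // no-op: key already set
assert(conf.get("spark.driver.host") == "10.0.0.42")

// ... and only fills in a default when nothing was configured.
val fresh = new SparkConf(loadDefaults = false)
fresh.setIfMissing("spark.driver.host", "127.0.0.1") // applied: key was missing
assert(fresh.get("spark.driver.host") == "127.0.0.1")
```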
### _How was this patch tested?_
It has been tested in the local environment and verified to behave as expected.

Closes #5148 from fantasticKe/feat_k8s_driver_ip.
Closes #5148
330f5f93d [marcoluo] feat: change how spark.driver.host is obtained in K8s mode
Authored-by: marcoluo <[email protected]>
Signed-off-by: Cheng Pan <[email protected]>
(cherry picked from commit 1dbb31b601e9be2879c02fcfb46b1435f87df92d)
Signed-off-by: Cheng Pan <[email protected]>
---
.../src/main/scala/org/apache/kyuubi/engine/spark/SparkSQLEngine.scala | 3 +--
1 file changed, 1 insertion(+), 2 deletions(-)
diff --git a/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/SparkSQLEngine.scala b/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/SparkSQLEngine.scala
index bdbc7c08f..9fe49b4b0 100644
--- a/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/SparkSQLEngine.scala
+++ b/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/SparkSQLEngine.scala
@@ -17,7 +17,6 @@
 package org.apache.kyuubi.engine.spark
 
-import java.net.InetAddress
 import java.time.Instant
 import java.util.{Locale, UUID}
 import java.util.concurrent.{CountDownLatch, ScheduledExecutorService, ThreadPoolExecutor, TimeUnit}
@@ -226,7 +225,7 @@ object SparkSQLEngine extends Logging {
     if (!isOnK8sClusterMode) {
       // set driver host to ip instead of kyuubi pod name
-      _sparkConf.set("spark.driver.host", InetAddress.getLocalHost.getHostAddress)
+      _sparkConf.setIfMissing("spark.driver.host", Utils.findLocalInetAddress.getHostAddress)
     }
   }
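For background, the sketch below illustrates the general technique such a helper typically uses; it is not Kyuubi's actual `Utils.findLocalInetAddress` implementation. Inside a K8s pod, `InetAddress.getLocalHost` may resolve the pod hostname to a loopback or otherwise unusable address, so a common approach is to scan the active network interfaces for a non-loopback IPv4 address. The `resolveLocalAddress` name and the `scala.jdk.CollectionConverters` import (Scala 2.13) are assumptions for illustration.

```scala
import java.net.{InetAddress, NetworkInterface}
import scala.jdk.CollectionConverters._ // Scala 2.13; on 2.12 use scala.collection.JavaConverters

// Illustrative sketch only: prefer a non-loopback IPv4 address from the
// active network interfaces, falling back to InetAddress.getLocalHost.
def resolveLocalAddress(): InetAddress = {
  val localHost = InetAddress.getLocalHost
  if (!localHost.isLoopbackAddress) {
    localHost
  } else {
    NetworkInterface.getNetworkInterfaces.asScala
      .filter(ni => ni.isUp && !ni.isLoopback)
      .flatMap(_.getInetAddresses.asScala)
      .find(addr => !addr.isLoopbackAddress && addr.getAddress.length == 4)
      .getOrElse(localHost)
  }
}

// Example usage: only applied when spark.driver.host was not supplied externally.
// _sparkConf.setIfMissing("spark.driver.host", resolveLocalAddress().getHostAddress)
```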