This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new fd3b61033304 [SPARK-50222][CORE] Support `spark.submit.appName`
fd3b61033304 is described below
commit fd3b61033304f38e7f9e4a738c55c7f6d86ad0af
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Tue Nov 5 10:07:22 2024 -0800
[SPARK-50222][CORE] Support `spark.submit.appName`
### What changes were proposed in this pull request?
This PR aims to support `spark.submit.appName`.
This is very useful for SREs and admins who re-submit the same job multiple times.
### Why are the changes needed?
Usually, `appName` is fixed at compile time, as in the following example, and
cannot be overridden with `-Dspark.app.name=xxx`. This PR provides a way to
override `appName` at submission time.
https://github.com/apache/spark/blob/0d2d031c2d907393ad6933677ea90ec95a652d50/examples/src/main/scala/org/apache/spark/examples/SparkPi.scala#L28-L31
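For reference, the linked lines in `SparkPi.scala` hard-code the name roughly as follows (quoted from that file; not new code):

```scala
val spark = SparkSession
  .builder()
  .appName("Spark Pi")
  .getOrCreate()
```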
**EXAMPLE 1**
```
$ bin/run-example -c spark.app.name=SPARK-50222 SparkPi 2>&1 | grep SPARK-50222 | jq
$ bin/run-example -c spark.submit.appName=SPARK-50222 SparkPi 2>&1 | grep SPARK-50222 | jq
{
"ts": "2024-11-05T01:12:57.434Z",
"level": "INFO",
"msg": "Submitted application: SPARK-50222",
"context": {
"app_name": "SPARK-50222"
},
"logger": "SparkContext"
}
```
**EXAMPLE 2**
```
$ bin/spark-shell -c spark.app.name=SPARK-50222
scala> sc.appName
val res0: String = Spark shell
$ bin/spark-shell --driver-java-options "-Dspark.app.name=SPARK-50222"
scala> sc.appName
val res0: String = Spark shell
$ bin/spark-shell -c spark.submit.appName=SPARK-50222
scala> sc.appName
val res0: String = SPARK-50222
```
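The resulting name-resolution order can be sketched in plain Scala. This is a hypothetical standalone model of the precedence shown above (`resolveAppName` is an illustrative helper, not Spark's actual code): `spark.submit.appName` wins over `spark.app.name`, which in turn falls back to a random UUID when absent.

```scala
// Hypothetical model of app-name precedence after this change.
// `conf` stands in for the submitted Spark configuration.
def resolveAppName(conf: Map[String, String]): String =
  conf.get("spark.submit.appName")                        // highest precedence
    .orElse(conf.get("spark.app.name"))                   // compile-time / -D value
    .getOrElse(java.util.UUID.randomUUID().toString)      // random fallback
```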
### Does this PR introduce _any_ user-facing change?
No, this is a new configuration.
### How was this patch tested?
Pass the CIs with a newly added test case.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #48755 from dongjoon-hyun/SPARK-50222.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala | 3 +++
.../scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala | 9 +++++++++
2 files changed, 12 insertions(+)
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala b/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
index 823356af3195..0c2aa6f941a2 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
@@ -889,6 +889,9 @@ object SparkSession extends api.BaseSparkSessionCompanion with Logging {
         // No active nor global default session. Create a new one.
         val sparkContext = userSuppliedContext.getOrElse {
+          // Override appName with the submitted appName
+          sparkConf.getOption("spark.submit.appName")
+            .map(sparkConf.setAppName)
           // set a random app name if not given.
           if (!sparkConf.contains("spark.app.name")) {
             sparkConf.setAppName(java.util.UUID.randomUUID().toString)
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala
index d3117ec411fe..a30b13df74ae 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala
@@ -589,4 +589,13 @@ class SparkSessionBuilderSuite extends SparkFunSuite with Eventually {
       assert(session.conf.get(e._1) == e._2.toString)
     }
   }
+
+  test("SPARK-50222: Support spark.submit.appName") {
+    val session = SparkSession.builder()
+      .master("local")
+      .appName("appName")
+      .config("spark.submit.appName", "newAppName")
+      .getOrCreate()
+    assert(session.sparkContext.appName === "newAppName")
+  }
 }