This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new b9d379a6b84b [SPARK-45777][CORE] Support `spark.test.appId` in `LocalSchedulerBackend`
b9d379a6b84b is described below
commit b9d379a6b84b67b29ccb578938764b888d64f293
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Nov 2 23:13:27 2023 -0700
[SPARK-45777][CORE] Support `spark.test.appId` in `LocalSchedulerBackend`
### What changes were proposed in this pull request?
This PR aims to support `spark.test.appId` in `LocalSchedulerBackend` like
the following.
```
$ bin/spark-shell --driver-java-options="-Dspark.test.appId=test-app-2023"
...
Spark context available as 'sc' (master = local[*], app id = test-app-2023).
```
```
$ bin/spark-shell -c spark.test.appId=test-app-2026 -c spark.eventLog.enabled=true -c spark.eventLog.dir=/Users/dongjoon/data/history
...
Spark context available as 'sc' (master = local[*], app id = test-app-2026).
```
### Why are the changes needed?
Like the other `spark.test.*` configurations, this enables developers to
control the appId in `LocalSchedulerBackend`.
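The fallback behavior introduced by the patch can be sketched in plain Scala (no Spark dependency; `resolveAppId` and the `Map`-based conf here are illustrative stand-ins for `SparkConf.get(key, default)`): the test-only key wins when present, otherwise a timestamp-based id is generated.

```scala
object AppIdFallback {
  // Stand-in for `conf.get("spark.test.appId", "local-" + System.currentTimeMillis)`:
  // return the test override if set, else generate a "local-<timestamp>" id.
  def resolveAppId(conf: Map[String, String]): String =
    conf.getOrElse("spark.test.appId", "local-" + System.currentTimeMillis)

  def main(args: Array[String]): Unit = {
    // Override present: the fixed, predictable id is used.
    println(resolveAppId(Map("spark.test.appId" -> "test-app-2023")))
    // Override absent: falls back to the generated default.
    println(resolveAppId(Map.empty))
  }
}
```

A fixed appId is mainly useful for tests and scripted runs, where a predictable event-log file name or UI URL is needed.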
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Manual.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #43645 from dongjoon-hyun/SPARK-45777.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.../scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala b/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala
index 79084b75f6c3..a00fe2a06899 100644
--- a/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala
+++ b/core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala
@@ -110,7 +110,7 @@ private[spark] class LocalSchedulerBackend(
val totalCores: Int)
extends SchedulerBackend with ExecutorBackend with Logging {
-  private val appId = "local-" + System.currentTimeMillis
+  private val appId = conf.get("spark.test.appId", "local-" + System.currentTimeMillis)
private var localEndpoint: RpcEndpointRef = null
private val userClassPath = getUserClasspath(conf)
private val listenerBus = scheduler.sc.listenerBus
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]