This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 8d1212837538 [SPARK-46797][CORE] Rename `spark.deploy.spreadOut` to 
`spark.deploy.spreadOutApps`
8d1212837538 is described below

commit 8d121283753894d4969d8ff9e09bb487f76e82e1
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Mon Jan 22 16:26:43 2024 -0800

    [SPARK-46797][CORE] Rename `spark.deploy.spreadOut` to 
`spark.deploy.spreadOutApps`
    
    ### What changes were proposed in this pull request?
    
    This PR aims to rename `spark.deploy.spreadOut` to 
`spark.deploy.spreadOutApps`.
    
    ### Why are the changes needed?
    
    Although the Apache Spark documentation clearly states that this setting 
applies to `applications`, the name still misleads users into forgetting that 
`Driver` JVMs are always spread out independently of this configuration.
    
    
https://github.com/apache/spark/blob/b80e8cb4552268b771fc099457b9186807081c4a/docs/spark-standalone.md?plain=1#L282-L285
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, the behavior is the same; Spark only logs a warning when the old config 
name is used.
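    For illustration only (not part of this patch), a user who had the old key
    in `spark-defaults.conf` would migrate like this; the old name keeps working
    as an alternative, just with a warning:

    ```properties
    # Before: legacy name, still honored as a fallback after this commit
    spark.deploy.spreadOut       true

    # After: preferred name going forward
    spark.deploy.spreadOutApps   true
    ```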
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #44838 from dongjoon-hyun/SPARK-46797.
    
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 core/src/main/scala/org/apache/spark/internal/config/Deploy.scala | 3 ++-
 docs/spark-standalone.md                                          | 2 +-
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/core/src/main/scala/org/apache/spark/internal/config/Deploy.scala 
b/core/src/main/scala/org/apache/spark/internal/config/Deploy.scala
index 6585d62b3b9c..31ac07621176 100644
--- a/core/src/main/scala/org/apache/spark/internal/config/Deploy.scala
+++ b/core/src/main/scala/org/apache/spark/internal/config/Deploy.scala
@@ -97,8 +97,9 @@ private[spark] object Deploy {
     .intConf
     .createWithDefault(10)
 
-  val SPREAD_OUT_APPS = ConfigBuilder("spark.deploy.spreadOut")
+  val SPREAD_OUT_APPS = ConfigBuilder("spark.deploy.spreadOutApps")
     .version("0.6.1")
+    .withAlternative("spark.deploy.spreadOut")
     .booleanConf
     .createWithDefault(true)
 
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index b9e3bb5d3f7f..6e454dff1bde 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -279,7 +279,7 @@ SPARK_MASTER_OPTS supports the following system properties:
   <td>1.1.0</td>
 </tr>
 <tr>
-  <td><code>spark.deploy.spreadOut</code></td>
+  <td><code>spark.deploy.spreadOutApps</code></td>
   <td>true</td>
   <td>
     Whether the standalone cluster manager should spread applications out 
across nodes or try


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
