This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new c1dad17c48e [SPARK-45961][DOCS] Document `spark.master.*` configurations
c1dad17c48e is described below

commit c1dad17c48e17d30b284f4d6082766086d1cb7d4
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Thu Nov 16 15:36:05 2023 -0800

    [SPARK-45961][DOCS] Document `spark.master.*` configurations
    
    ### What changes were proposed in this pull request?
    
    This PR documents `spark.master.*` configurations.
    
    ### Why are the changes needed?
    
    Currently, `spark.master.*` configurations are undocumented.
    ```
    $ git grep 'ConfigBuilder("spark.master'
    core/src/main/scala/org/apache/spark/internal/config/UI.scala:  val MASTER_UI_DECOMMISSION_ALLOW_MODE = ConfigBuilder("spark.master.ui.decommission.allow.mode")
    core/src/main/scala/org/apache/spark/internal/config/package.scala:  private[spark] val MASTER_REST_SERVER_ENABLED = ConfigBuilder("spark.master.rest.enabled")
    core/src/main/scala/org/apache/spark/internal/config/package.scala:  private[spark] val MASTER_REST_SERVER_PORT = ConfigBuilder("spark.master.rest.port")
    core/src/main/scala/org/apache/spark/internal/config/package.scala:  private[spark] val MASTER_UI_PORT = ConfigBuilder("spark.master.ui.port")
    core/src/main/scala/org/apache/spark/internal/config/package.scala:    ConfigBuilder("spark.master.ui.historyServerUrl")
    core/src/main/scala/org/apache/spark/internal/config/package.scala:    ConfigBuilder("spark.master.useAppNameAsAppId.enabled")
    ```
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Manual review.
    
    ![Screenshot 2023-11-16 at 2 48 37 PM](https://github.com/apache/spark/assets/9700541/1fb90997-22be-4b2a-8db6-08f3db1340d9)
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #43848 from dongjoon-hyun/SPARK-45961.
    
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 docs/spark-standalone.md | 52 ++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 52 insertions(+)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index c96839c6e95..ce739cb90b5 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -190,6 +190,58 @@ SPARK_MASTER_OPTS supports the following system properties:
 
 <table class="table table-striped">
 <thead><tr><th>Property Name</th><th>Default</th><th>Meaning</th><th>Since Version</th></tr></thead>
+<tr>
+  <td><code>spark.master.ui.port</code></td>
+  <td><code>8080</code></td>
+  <td>
+    Specifies the port number of the Master Web UI endpoint.
+  </td>
+  <td>1.1.0</td>
+</tr>
+<tr>
+  <td><code>spark.master.ui.decommission.allow.mode</code></td>
+  <td><code>LOCAL</code></td>
+  <td>
+    Specifies the behavior of the Master Web UI's /workers/kill endpoint. Possible choices
+    are: <code>LOCAL</code> means allow this endpoint from IPs that are local to the machine running
+    the Master, <code>DENY</code> means to completely disable this endpoint, <code>ALLOW</code> means to allow
+    calling this endpoint from any IP.
+  </td>
+  <td>3.1.0</td>
+</tr>
+<tr>
+  <td><code>spark.master.ui.historyServerUrl</code></td>
+  <td>(None)</td>
+  <td>
+    The URL where the Spark history server is running. Note that this assumes
+    that all Spark jobs share the same event log location, which the history server accesses.
+  </td>
+  <td>4.0.0</td>
+</tr>
+<tr>
+  <td><code>spark.master.rest.enabled</code></td>
+  <td><code>false</code></td>
+  <td>
+    Whether to enable the Master REST API endpoint.
+  </td>
+  <td>1.3.0</td>
+</tr>
+<tr>
+  <td><code>spark.master.rest.port</code></td>
+  <td><code>6066</code></td>
+  <td>
+    Specifies the port number of the Master REST API endpoint.
+  </td>
+  <td>1.3.0</td>
+</tr>
+<tr>
+  <td><code>spark.master.useAppNameAsAppId.enabled</code></td>
+  <td><code>false</code></td>
+  <td>
+    (Experimental) If true, the Spark master uses the user-provided appName as the appId.
+  </td>
+  <td>4.0.0</td>
+</tr>
 <tr>
   <td><code>spark.deploy.retainedApplications</code></td>
   <td>200</td>


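As a sketch of how the newly documented properties might be applied: standalone master system properties such as these are typically passed through `SPARK_MASTER_OPTS` when launching the master daemon. The values below are illustrative examples, not recommendations from this patch.

```shell
# Illustrative sketch (assumes a standalone Spark installation):
# pass spark.master.* system properties to the master daemon via
# SPARK_MASTER_OPTS, then start the master with the bundled script.
export SPARK_MASTER_OPTS="-Dspark.master.rest.enabled=true \
-Dspark.master.rest.port=6066 \
-Dspark.master.ui.decommission.allow.mode=DENY"
./sbin/start-master.sh
```

The same properties can alternatively be placed in `conf/spark-defaults.conf`; `SPARK_MASTER_OPTS` is shown here because these settings apply to the master process rather than to an individual application.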