This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 3b3e3011326 [SPARK-42730][CONNECT][DOCS] Update Spark Standalone Mode page
3b3e3011326 is described below

commit 3b3e30113262455553a0bb7d668b2a7d9a23a05d
Author: pegasas <616672...@qq.com>
AuthorDate: Thu Aug 3 12:15:19 2023 +0900

    [SPARK-42730][CONNECT][DOCS] Update Spark Standalone Mode page
    
    ### What changes were proposed in this pull request?
    
    Add `sbin/start-connect-server.sh` and `sbin/stop-connect-server.sh` to the list of launch/stop scripts on the Spark Standalone Mode page so that it also covers Spark Connect.
    
    ### Why are the changes needed?
    
    The Spark Standalone Mode page did not document how to start or stop the Spark Connect server.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Manually verified the documentation change.
    
    Closes #42307 from pegasas/doc.
    
    Authored-by: pegasas <616672...@qq.com>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 docs/spark-standalone.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index d47ff3987f9..3e87edad0aa 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -104,10 +104,12 @@ Once you've set up this file, you can launch or stop your cluster with the follo
 - `sbin/start-master.sh` - Starts a master instance on the machine the script is executed on.
 - `sbin/start-workers.sh` - Starts a worker instance on each machine specified in the `conf/workers` file.
 - `sbin/start-worker.sh` - Starts a worker instance on the machine the script is executed on.
+- `sbin/start-connect-server.sh` - Starts a Spark Connect server on the machine the script is executed on.
 - `sbin/start-all.sh` - Starts both a master and a number of workers as described above.
 - `sbin/stop-master.sh` - Stops the master that was started via the `sbin/start-master.sh` script.
 - `sbin/stop-worker.sh` - Stops all worker instances on the machine the script is executed on.
 - `sbin/stop-workers.sh` - Stops all worker instances on the machines specified in the `conf/workers` file.
+- `sbin/stop-connect-server.sh` - Stops all Spark Connect server instances on the machine the script is executed on.
 - `sbin/stop-all.sh` - Stops both the master and the workers as described above.
 
 Note that these scripts must be executed on the machine you want to run the Spark master on, not your local machine.
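
For context, a minimal sketch of how a client session could attach to a server launched with `sbin/start-connect-server.sh`. The localhost URL and port 15002 (the Spark Connect default) are assumptions for illustration, not part of this patch:

    # Minimal sketch: connect a PySpark client to a Spark Connect server
    # started with `sbin/start-connect-server.sh`. The host and port are
    # assumptions (15002 is the Spark Connect default); adjust the URL
    # for your deployment.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()

    # Run a trivial query to confirm the session is talking to the server.
    spark.range(5).show()

    spark.stop()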

