This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 1cb8250d166 [MINOR][DOCS][CONNECT] Fix docs to run Spark Connect locally built
1cb8250d166 is described below

commit 1cb8250d166f5d877f24f6ea097d27d7168ecf15
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Thu Oct 13 16:39:26 2022 +0900

    [MINOR][DOCS][CONNECT] Fix docs to run Spark Connect locally built
    
    ### What changes were proposed in this pull request?
    
    This PR adds some more command examples to run Spark Connect that you built locally.
    
    ### Why are the changes needed?
    
    To guide developers in running a Spark Connect build from their local checkout.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, dev-only and doc-only.
    
    ### How was this patch tested?
    
    The commands were tested manually in my local environment.
    
    Closes #38236 from HyukjinKwon/minor-docs-spark-cnnect.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 python/pyspark/sql/connect/README.md | 36 ++++++++++++++++++++++++------------
 1 file changed, 24 insertions(+), 12 deletions(-)

diff --git a/python/pyspark/sql/connect/README.md b/python/pyspark/sql/connect/README.md
index bd42142ac78..b9cfb31d13c 100644
--- a/python/pyspark/sql/connect/README.md
+++ b/python/pyspark/sql/connect/README.md
@@ -9,22 +9,34 @@ of Spark. To enable it, you only need to activate the driver plugin for Spark Co
 
 ## Build
 
-1. Build Spark as usual per the documentation.
-
-2. Build and package the Spark Connect package
-
-   ```bash
-   ./build/mvn -Phive package
-   ```
+```bash
+./build/mvn -Phive clean package
+```
 
-   or
+or
 
-   ```bash
-   ./build/sbt -Phive package
-   ```
+```bash
+./build/sbt -Phive clean package
+```
    
 ## Run Spark Shell
 
+To run the Spark Connect that you built locally:
+
+```bash
+# Scala shell
+./bin/spark-shell \
+  --jars `ls connector/connect/target/**/spark-connect*SNAPSHOT.jar | paste -sd ',' -` \
+  --conf spark.plugins=org.apache.spark.sql.connect.SparkConnectPlugin
+
+# PySpark shell
+./bin/pyspark \
+  --jars `ls connector/connect/target/**/spark-connect*SNAPSHOT.jar | paste -sd ',' -` \
+  --conf spark.plugins=org.apache.spark.sql.connect.SparkConnectPlugin
+```
+
+To use the release version of Spark Connect:
+
 ```bash
 ./bin/spark-shell \
   --packages org.apache.spark:spark-connect_2.12:3.4.0 \
@@ -34,6 +46,6 @@ of Spark. To enable it, you only need to activate the driver plugin for Spark Co
 ## Run Tests
 
 ```bash
-./run-tests --testnames 'pyspark.sql.tests.test_connect_basic'
+./python/run-tests --testnames 'pyspark.sql.tests.test_connect_basic'
 ```
 
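In the shell snippets added by this patch, the backtick expression builds the value for `--jars`: `ls` expands the glob to the matching jar paths, and `paste -sd ',' -` joins them into the single comma-separated list that `--jars` expects. A minimal stand-alone sketch of that idiom, using hypothetical placeholder jar names in a scratch directory rather than a real Spark build:

```shell
# Create placeholder jars to stand in for a local build (names are made up)
mkdir -p /tmp/connect-demo/target
touch /tmp/connect-demo/target/spark-connect_2.12-3.4.0-SNAPSHOT.jar
touch /tmp/connect-demo/target/spark-connect-common_2.12-3.4.0-SNAPSHOT.jar

# ls emits one matching path per line; paste -sd ',' - joins those lines
# with commas into the single-line format that spark-shell's --jars takes
JARS=`ls /tmp/connect-demo/target/spark-connect*SNAPSHOT.jar | paste -sd ',' -`
echo "$JARS"
```

The same glob-and-join pattern works for any `--jars` list whose members share a common naming scheme.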


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
