This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-4.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.0 by this push:
     new ec93ac479abf [SPARK-50987][DOCS] Make `spark-connect-overview.md`'s version strings up-to-date
ec93ac479abf is described below

commit ec93ac479abf1e03d77b388015be524a20a92e8a
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri Jan 24 23:24:29 2025 -0800

    [SPARK-50987][DOCS] Make `spark-connect-overview.md`'s version strings up-to-date
    
    ### What changes were proposed in this pull request?
    
    This PR aims to make `spark-connect-overview.md`'s version strings up-to-date.
    
    ### Why are the changes needed?
    
    - https://apache.github.io/spark/spark-connect-overview.html
    
    **BEFORE**
    
    <img width="477" alt="Screenshot 2025-01-24 at 11 17 03 PM" src="https://github.com/user-attachments/assets/4ee91119-e116-4573-8446-32bf18342ac5" />
    
    **AFTER**
    <img width="318" alt="Screenshot 2025-01-24 at 11 17 22 PM" src="https://github.com/user-attachments/assets/9e3f9061-6623-440f-8031-5ee85666675c" />
    
    **BEFORE**
    <img width="546" alt="Screenshot 2025-01-24 at 11 17 58 PM" src="https://github.com/user-attachments/assets/dc3ac80b-a5fc-4ea2-bf1d-4025a6ae204f" />
    
    **AFTER**
    <img width="552" alt="Screenshot 2025-01-24 at 11 18 35 PM" src="https://github.com/user-attachments/assets/8c5fe8cf-a8b1-4933-a593-3037f356c81a" />
    
    **BEFORE**
    <img width="679" alt="Screenshot 2025-01-24 at 11 21 33 PM" src="https://github.com/user-attachments/assets/d4d69efe-2fb4-43ea-be13-0d1bbe251b2c" />
    
    **AFTER**
    <img width="674" alt="Screenshot 2025-01-24 at 11 22 29 PM" src="https://github.com/user-attachments/assets/09a413fe-3659-4bba-b37c-609f2d6f16ba" />
    
    ### Does this PR introduce _any_ user-facing change?
    
    This keeps the document up-to-date for Apache Spark 3.5.5/4.0.0+.
    
    ### How was this patch tested?
    
    Manual review.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No
    
    Closes #49665 from dongjoon-hyun/SPARK-50987.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
    (cherry picked from commit 21f051246101ebbfcf13df41ce85c45b7fe5f41e)
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 docs/spark-connect-overview.md | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/docs/spark-connect-overview.md b/docs/spark-connect-overview.md
index 723bae9fd9be..2b4c3ee4f77f 100644
--- a/docs/spark-connect-overview.md
+++ b/docs/spark-connect-overview.md
@@ -284,11 +284,11 @@ The connection may also be programmatically created using _SparkSession#builder_
 
 <div data-lang="python"  markdown="1">
 
-First, install PySpark with `pip install pyspark[connect]==3.5.0` or if building a packaged PySpark application/library,
+First, install PySpark with `pip install pyspark[connect]=={{site.SPARK_VERSION_SHORT}}` or if building a packaged PySpark application/library,
 add it your setup.py file as:
 {% highlight python %}
 install_requires=[
-'pyspark[connect]==3.5.0'
+'pyspark[connect]=={{site.SPARK_VERSION_SHORT}}'
 ]
 {% endhighlight %}
 
@@ -335,7 +335,7 @@ Lines with a: 72, lines with b: 39
 To use Spark Connect as part of a Scala application/project, we first need to include the right dependencies.
 Using the `sbt` build system as an example, we add the following dependencies to the `build.sbt` file:
 {% highlight sbt %}
-libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "3.5.0"
+libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "{{site.SPARK_VERSION_SHORT}}"
 {% endhighlight %}
 
 When writing your own code, include the `remote` function with a reference to
@@ -380,9 +380,9 @@ HTTP/2 interface allows for the use of authenticating proxies, which makes
 it possible to secure Spark Connect without having to implement authentication
 logic in Spark directly.
 
-# What is supported in Spark 3.4
+# What is supported
 
-**PySpark**: In Spark 3.4, Spark Connect supports most PySpark APIs, including
+**PySpark**: Since Spark 3.4, Spark Connect supports most PySpark APIs, including
 [DataFrame](api/python/reference/pyspark.sql/dataframe.html),
 [Functions](api/python/reference/pyspark.sql/functions.html), and
 [Column](api/python/reference/pyspark.sql/column.html). However,
@@ -393,7 +393,7 @@ supported in the [API reference](api/python/reference/index.html) documentation.
 Supported APIs are labeled "Supports Spark Connect" so you can check whether the
 APIs you are using are available before migrating existing code to Spark Connect.
 
-**Scala**: In Spark 3.5, Spark Connect supports most Scala APIs, including
+**Scala**: Since Spark 3.5, Spark Connect supports most Scala APIs, including
 [Dataset](api/scala/org/apache/spark/sql/Dataset.html),
 [functions](api/scala/org/apache/spark/sql/functions$.html),
 [Column](api/scala/org/apache/spark/sql/Column.html),

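For context, the `{{site.SPARK_VERSION_SHORT}}` placeholders introduced by this patch are Jekyll/Liquid site variables that get substituted with the release version when the docs site is built, which is what keeps the snippets current across releases. A minimal Python sketch of that substitution (the `4.0.0` value and the `render` helper are illustrative, not part of this commit or of Jekyll's actual implementation):

```python
import re

def render(template: str, site: dict) -> str:
    # Replace {{site.KEY}} placeholders with values from the site config,
    # loosely mimicking what Jekyll does when building the docs.
    return re.sub(
        r"\{\{\s*site\.(\w+)\s*\}\}",
        lambda m: str(site[m.group(1)]),
        template,
    )

line = "pip install pyspark[connect]=={{site.SPARK_VERSION_SHORT}}"
print(render(line, {"SPARK_VERSION_SHORT": "4.0.0"}))
# pip install pyspark[connect]==4.0.0
```

In the real docs build, the value comes from `SPARK_VERSION_SHORT` in the site configuration, so hardcoded versions like `3.5.0` no longer drift out of date.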

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
