This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.5
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.5 by this push:
     new 22118f9f696c [SPARK-50987][DOCS] Make `spark-connect-overview.md`s version strings up-to-date
22118f9f696c is described below

commit 22118f9f696c83066a02e58ec1fdf49930f47d16
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri Jan 24 23:24:29 2025 -0800

    [SPARK-50987][DOCS] Make `spark-connect-overview.md`s version strings up-to-date
    
    This PR aims to make `spark-connect-overview.md`'s version strings up-to-date.
    
    - https://apache.github.io/spark/spark-connect-overview.html
    
    **BEFORE**
    
    <img width="477" alt="Screenshot 2025-01-24 at 11 17 03 PM" src="https://github.com/user-attachments/assets/4ee91119-e116-4573-8446-32bf18342ac5" />
    
    **AFTER**
    <img width="318" alt="Screenshot 2025-01-24 at 11 17 22 PM" src="https://github.com/user-attachments/assets/9e3f9061-6623-440f-8031-5ee85666675c" />
    
    **BEFORE**
    <img width="546" alt="Screenshot 2025-01-24 at 11 17 58 PM" src="https://github.com/user-attachments/assets/dc3ac80b-a5fc-4ea2-bf1d-4025a6ae204f" />
    
    **AFTER**
    <img width="552" alt="Screenshot 2025-01-24 at 11 18 35 PM" src="https://github.com/user-attachments/assets/8c5fe8cf-a8b1-4933-a593-3037f356c81a" />
    
    **BEFORE**
    <img width="679" alt="Screenshot 2025-01-24 at 11 21 33 PM" src="https://github.com/user-attachments/assets/d4d69efe-2fb4-43ea-be13-0d1bbe251b2c" />
    
    **AFTER**
    <img width="674" alt="Screenshot 2025-01-24 at 11 22 29 PM" src="https://github.com/user-attachments/assets/09a413fe-3659-4bba-b37c-609f2d6f16ba" />
    
    This keeps the document up-to-date for Apache Spark 3.5.5/4.0.0+.
    
    Manual review.
    
    No
    
    Closes #49665 from dongjoon-hyun/SPARK-50987.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
    (cherry picked from commit 21f051246101ebbfcf13df41ce85c45b7fe5f41e)
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 docs/spark-connect-overview.md | 14 +++++++-------
 1 file changed, 7 insertions(+), 7 deletions(-)

diff --git a/docs/spark-connect-overview.md b/docs/spark-connect-overview.md
index 9da559c37fde..0283c013ad0f 100644
--- a/docs/spark-connect-overview.md
+++ b/docs/spark-connect-overview.md
@@ -279,11 +279,11 @@ The connection may also be programmatically created using _SparkSession#builder_
 
 <div data-lang="python"  markdown="1">
 
-First, install PySpark with `pip install pyspark[connect]==3.5.0` or if building a packaged PySpark application/library,
+First, install PySpark with `pip install pyspark[connect]=={{site.SPARK_VERSION_SHORT}}` or if building a packaged PySpark application/library,
 add it your setup.py file as:
 {% highlight python %}
 install_requires=[
-'pyspark[connect]==3.5.0'
+'pyspark[connect]=={{site.SPARK_VERSION_SHORT}}'
 ]
 {% endhighlight %}
 
@@ -330,8 +330,8 @@ Lines with a: 72, lines with b: 39
 To use Spark Connect as part of a Scala application/project, we first need to include the right dependencies.
 Using the `sbt` build system as an example, we add the following dependencies to the `build.sbt` file:
 {% highlight sbt %}
-libraryDependencies += "org.apache.spark" %% "spark-sql-api" % "3.5.0"
-libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "3.5.0"
+libraryDependencies += "org.apache.spark" %% "spark-sql-api" % "{{site.SPARK_VERSION_SHORT}}"
+libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "{{site.SPARK_VERSION_SHORT}}"
 {% endhighlight %}
 
 When writing your own code, include the `remote` function with a reference to
@@ -374,9 +374,9 @@ HTTP/2 interface allows for the use of authenticating proxies, which makes
 it possible to secure Spark Connect without having to implement authentication
 logic in Spark directly.
 
-# What is supported in Spark 3.4
+# What is supported
 
-**PySpark**: In Spark 3.4, Spark Connect supports most PySpark APIs, including
+**PySpark**: Since Spark 3.4, Spark Connect supports most PySpark APIs, including
 [DataFrame](api/python/reference/pyspark.sql/dataframe.html),
 [Functions](api/python/reference/pyspark.sql/functions.html), and
 [Column](api/python/reference/pyspark.sql/column.html). However,
@@ -387,7 +387,7 @@ supported in the [API reference](api/python/reference/index.html) documentation.
 Supported APIs are labeled "Supports Spark Connect" so you can check whether the
 APIs you are using are available before migrating existing code to Spark Connect.
 
-**Scala**: In Spark 3.5, Spark Connect supports most Scala APIs, including
+**Scala**: Since Spark 3.5, Spark Connect supports most Scala APIs, including
 [Dataset](api/scala/org/apache/spark/sql/Dataset.html),
 [functions](api/scala/org/apache/spark/sql/functions$.html),
 [Column](api/scala/org/apache/spark/sql/Column.html),
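
The substance of the patch above is swapping hard-coded `3.5.0` strings for the `{{site.SPARK_VERSION_SHORT}}` placeholder, which the Jekyll docs build resolves from site configuration at render time. A minimal sketch of that substitution (plain Python string replacement standing in for the Liquid engine; the `3.5.5` value is an assumed example for branch-3.5):

```python
# Hypothetical sketch of the docs-build substitution: at render time,
# Jekyll replaces {{site.SPARK_VERSION_SHORT}} with the version string
# from the site configuration. This uses plain string replacement, not
# the actual Liquid templating engine.
SPARK_VERSION_SHORT = "3.5.5"  # assumed example value for branch-3.5

template = "pip install pyspark[connect]=={{site.SPARK_VERSION_SHORT}}"
rendered = template.replace("{{site.SPARK_VERSION_SHORT}}", SPARK_VERSION_SHORT)
print(rendered)  # pip install pyspark[connect]==3.5.5
```

Because every occurrence is templated, the rendered page tracks the release the docs were built for instead of pinning readers to 3.5.0.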


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
