This is an automated email from the ASF dual-hosted git repository.

hvanhovell pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 8d78f5b02c25 [DOCS][MINOR] Remove spark-sql-api dependency from connect docs
8d78f5b02c25 is described below

commit 8d78f5b02c256423ee125c31bf746a1a15dbcf25
Author: Enrico Minack <[email protected]>
AuthorDate: Mon Sep 16 12:30:04 2024 -0400

    [DOCS][MINOR] Remove spark-sql-api dependency from connect docs
    
    ### What changes were proposed in this pull request?
    Remove `spark-sql-api` dependency from documentation.
    
    ### Why are the changes needed?
    The `spark-connect-client-jvm` dependency is sufficient, as it includes a shaded copy of the `spark-sql-api` package.
    
    In fact, adding the `spark-sql-api` dependency breaks the application at runtime:
    1. The transitive dependency `io.netty:netty-buffer`, which is also included (shaded) in `spark-connect-client-jvm`, cannot be found: `NoClassDefFoundError: io/netty/buffer/PooledByteBufAllocator`
    2. The method `ArrowUtils$.toArrowSchema` provided by `spark-sql-api` cannot be found: `NoSuchMethodError: 'org.sparkproject.org.apache.arrow.vector.types.pojo.Schema org.apache.spark.sql.util.ArrowUtils$.toArrowSchema(org.apache.spark.sql.types.StructType, java.lang.String, boolean, boolean)'`
    
    ### Does this PR introduce _any_ user-facing change?
    No.
    
    ### How was this patch tested?
    Manually, with a minimal Maven project.
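
    The minimal Maven project is not included in the patch; as a hedged illustration only, its dependency section would declare just the connect client artifact (the Scala suffix and version shown here are assumptions, matching the sbt example in the docs):

    {% highlight xml %}
    <!-- Hypothetical minimal pom.xml fragment. Only spark-connect-client-jvm
         is declared, since it shades spark-sql-api and its transitive
         dependencies; adding spark-sql-api alongside it triggers the
         runtime errors described above. -->
    <dependencies>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <!-- _2.13 suffix is an assumption; use the suffix matching your Scala version -->
        <artifactId>spark-connect-client-jvm_2.13</artifactId>
        <version>3.5.0</version>
      </dependency>
    </dependencies>
    {% endhighlight %}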
    
    ### Was this patch authored or co-authored using generative AI tooling?
    No.
    
    Closes #48119 from EnricoMi/docs-connect-jvm-deps.
    
    Authored-by: Enrico Minack <[email protected]>
    Signed-off-by: Herman van Hovell <[email protected]>
---
 docs/spark-connect-overview.md | 1 -
 1 file changed, 1 deletion(-)

diff --git a/docs/spark-connect-overview.md b/docs/spark-connect-overview.md
index b77f71fb695d..1cc409bfbc00 100644
--- a/docs/spark-connect-overview.md
+++ b/docs/spark-connect-overview.md
@@ -335,7 +335,6 @@ Lines with a: 72, lines with b: 39
 To use Spark Connect as part of a Scala application/project, we first need to include the right dependencies.
 Using the `sbt` build system as an example, we add the following dependencies to the `build.sbt` file:
 {% highlight sbt %}
-libraryDependencies += "org.apache.spark" %% "spark-sql-api" % "3.5.0"
 libraryDependencies += "org.apache.spark" %% "spark-connect-client-jvm" % "3.5.0"
 {% endhighlight %}
 


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
