This is an automated email from the ASF dual-hosted git repository.
yangjie01 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new b9dd02baed3e [SPARK-54524][BUILD] Fix Connect JDBC driver dependencies
b9dd02baed3e is described below
commit b9dd02baed3e69e983c05d63ffbcab515e3d1c71
Author: Cheng Pan <[email protected]>
AuthorDate: Wed Nov 26 19:48:29 2025 +0800
[SPARK-54524][BUILD] Fix Connect JDBC driver dependencies
### What changes were proposed in this pull request?
The Connect JDBC driver is built on top of the Connect JVM client, so it
should declare the Connect JVM client as a dependency and let Maven resolve
the transitive deps correctly.
Note: many RDBMS products ship a single no-deps JDBC driver jar to simplify
integration for downstream projects. This is not achieved at the current
stage and may be considered in the future.
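For reference, once the driver is released, a downstream project would be expected to declare only the driver itself and rely on Maven's transitive resolution. The fragment below is an illustrative sketch; the coordinates follow the `spark-connect-client-jdbc_2.13` artifact naming used elsewhere in this patch, and the version is hypothetical:

```xml
<!-- Hypothetical consumer pom.xml fragment: only the driver is declared;
     the Connect JVM client comes in transitively -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-connect-client-jdbc_2.13</artifactId>
  <version>4.2.0</version>
</dependency>
```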
### Why are the changes needed?
Fix Connect JDBC driver dependency management.
For example, it should not pull in protobuf libs that are already shaded in
connect-jdbc-client.
https://mvnrepository.com/artifact/org.apache.spark/spark-connect-client-jdbc_2.13/4.1.0-preview4
### Does this PR introduce _any_ user-facing change?
No, the Connect JDBC driver is not released yet.
### How was this patch tested?
Pass GHA, plus manual verifications.
- Checked the effective `pom.xml`, located at
`sql/connect/client/jdbc/dependency-reduced-pom.xml` after compiling; it
contains only one compile-scope dep.
```xml
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-connect-client-jvm_2.13</artifactId>
<version>4.2.0-SNAPSHOT</version>
<scope>compile</scope>
</dependency>
```
- Verified locally by running tests using Maven/SBT
Maven
```
build/mvn -Phive,hive-thriftserver clean install -DskipTests
build/mvn -Phive,hive-thriftserver -pl sql/connect/client/jdbc test
```
SBT
```
build/sbt -Phive,hive-thriftserver "connect-client-jdbc/test"
```
- Verified CLI cases with BeeLine against a binary release artifact built
with Maven/SBT
Launch a Connect Server first, then
```
dev/make-distribution.sh [--sbt-enabled] -Phive,hive-thriftserver
cd dist
SPARK_CONNECT_BEELINE=1 bin/beeline -u jdbc:sc://localhost:15002 \
  -e "select 'Hello, Spark Connect!', version() as server_version;"
```
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #53230 from pan3793/SPARK-53484.
Authored-by: Cheng Pan <[email protected]>
Signed-off-by: yangjie01 <[email protected]>
---
sql/connect/client/jdbc/pom.xml | 82 ++---------------------------------------
1 file changed, 3 insertions(+), 79 deletions(-)
diff --git a/sql/connect/client/jdbc/pom.xml b/sql/connect/client/jdbc/pom.xml
index bec3eff7ec56..14365b53b70a 100644
--- a/sql/connect/client/jdbc/pom.xml
+++ b/sql/connect/client/jdbc/pom.xml
@@ -37,77 +37,8 @@
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
- <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
- <version>${project.version}</version>
- </dependency>
- <dependency>
- <groupId>org.apache.spark</groupId>
- <artifactId>spark-sql-api_${scala.binary.version}</artifactId>
- <version>${project.version}</version>
- </dependency>
- <dependency>
- <groupId>org.apache.spark</groupId>
- <artifactId>spark-connect-shims_${scala.binary.version}</artifactId>
- <version>${project.version}</version>
- </dependency>
- <dependency>
- <groupId>org.apache.spark</groupId>
- <artifactId>spark-sketch_${scala.binary.version}</artifactId>
- <version>${project.version}</version>
- </dependency>
- <dependency>
- <groupId>org.scala-lang</groupId>
- <artifactId>scala-compiler</artifactId>
- <scope>compile</scope>
- </dependency>
- <dependency>
- <groupId>org.scala-lang.modules</groupId>
- <artifactId>scala-xml_${scala.binary.version}</artifactId>
- <scope>compile</scope>
- </dependency>
- <!--
- We need to define protobuf here because we need to change the scope of both from
- provided to compile. If we don't do this we can't shade these libraries.
- -->
- <dependency>
- <groupId>com.google.protobuf</groupId>
- <artifactId>protobuf-java</artifactId>
- <scope>compile</scope>
- </dependency>
- <dependency>
- <groupId>com.google.guava</groupId>
- <artifactId>guava</artifactId>
- <scope>compile</scope>
- </dependency>
- <dependency>
- <groupId>com.google.guava</groupId>
- <artifactId>failureaccess</artifactId>
- <scope>compile</scope>
- </dependency>
- <dependency>
- <groupId>org.apache.spark</groupId>
- <artifactId>spark-tags_${scala.binary.version}</artifactId>
- <type>test-jar</type>
- <scope>test</scope>
- </dependency>
- <dependency>
- <groupId>org.scalacheck</groupId>
- <artifactId>scalacheck_${scala.binary.version}</artifactId>
- <scope>test</scope>
- </dependency>
- <dependency>
- <groupId>org.apache.spark</groupId>
- <artifactId>spark-sql-api_${scala.binary.version}</artifactId>
- <version>${project.version}</version>
- <classifier>tests</classifier>
- <scope>test</scope>
- </dependency>
- <dependency>
- <groupId>org.apache.spark</groupId>
- <artifactId>spark-common-utils_${scala.binary.version}</artifactId>
+ <artifactId>spark-connect-client-jvm_${scala.binary.version}</artifactId>
<version>${project.version}</version>
- <classifier>tests</classifier>
- <scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
@@ -116,13 +47,6 @@
<classifier>tests</classifier>
<scope>test</scope>
</dependency>
- <!-- Use mima to perform the compatibility check -->
- <dependency>
- <groupId>com.typesafe</groupId>
- <artifactId>mima-core_${scala.binary.version}</artifactId>
- <version>${mima.version}</version>
- <scope>test</scope>
- </dependency>
</dependencies>
<build>
<outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
@@ -142,10 +66,10 @@
<artifactId>maven-shade-plugin</artifactId>
<configuration combine.self = "override">
<shadedArtifactAttached>false</shadedArtifactAttached>
- <promoteTransitiveDependencies>true</promoteTransitiveDependencies>
+ <promoteTransitiveDependencies>false</promoteTransitiveDependencies>
<artifactSet>
<includes>
- <include>org.apache.spark:spark-connect-client-jdbc_${scala.binary.version}</include>
+ <include>org.spark-project.spark:unused</include>
</includes>
</artifactSet>
<relocations>