LuciferYang commented on code in PR #42441:
URL: https://github.com/apache/spark/pull/42441#discussion_r1290854590
##########
connector/connect/client/jvm/pom.xml:
##########
@@ -46,6 +46,12 @@
</exclusion>
</exclusions>
</dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-client-jvm-internal_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <scope>provided</scope>
+ </dependency>
Review Comment:
https://github.com/apache/spark/blob/b828ff0bdba8e4fa8506c9d1fdc4a48ec491d52a/connector/connect/client/jvm/pom.xml#L155-L156
I recommend including `connect-client-jvm-internal` in the `include` list of
the `maven-shade-plugin` (like `spark-connect-common` and `spark-common-utils`).
That way, at the user level, it would still be sufficient to depend only on the
`connect-client-jvm` uber jar.
There might be some other modules that need to be included as well, such as
`sql-api`, and so on. WDYT @hvanhovell
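For concreteness, a rough sketch of how the `artifactSet` in the
`connect-client-jvm` shade configuration could be extended (illustrative only —
whether `sql-api` and other modules belong here depends on what actually moves
into the internal module):

```xml
<!-- Sketch: extend the maven-shade-plugin artifactSet in connect-client-jvm -->
<artifactSet>
  <includes>
    <!-- already included today -->
    <include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
    <include>org.apache.spark:spark-common-utils_${scala.binary.version}</include>
    <!-- proposed addition -->
    <include>org.apache.spark:spark-connect-client-jvm-internal_${scala.binary.version}</include>
    <!-- possibly needed as well, to be confirmed -->
    <include>org.apache.spark:spark-sql-api_${scala.binary.version}</include>
  </includes>
</artifactSet>
```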
##########
connector/connect/client/jvm-internal/pom.xml:
##########
@@ -0,0 +1,235 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one or more
+ ~ contributor license agreements. See the NOTICE file distributed with
+ ~ this work for additional information regarding copyright ownership.
+ ~ The ASF licenses this file to You under the Apache License, Version 2.0
+ ~ (the "License"); you may not use this file except in compliance with
+ ~ the License. You may obtain a copy of the License at
+ ~
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing, software
+ ~ distributed under the License is distributed on an "AS IS" BASIS,
+ ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ ~ See the License for the specific language governing permissions and
+ ~ limitations under the License.
+ -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+ <parent>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-parent_2.12</artifactId>
+ <version>4.0.0-SNAPSHOT</version>
+ <relativePath>../../../../pom.xml</relativePath>
+ </parent>
+
+ <artifactId>spark-connect-client-jvm-internal_2.12</artifactId>
+ <packaging>jar</packaging>
+ <name>Spark Project Connect Client Internal</name>
+ <url>https://spark.apache.org/</url>
+ <properties>
+ <sbt.project.name>connect-client-jvm-internal</sbt.project.name>
+ </properties>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sql-api_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sketch_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.google.protobuf</groupId>
+ <artifactId>protobuf-java</artifactId>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ <version>${connect.guava.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>failureaccess</artifactId>
+ <version>${guava.failureaccess.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-codec-http2</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-handler-proxy</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-transport-native-unix-common</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.lihaoyi</groupId>
+ <artifactId>ammonite_${scala.version}</artifactId>
+ <version>${ammonite.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <type>test-jar</type>
+ <scope>test</scope>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.scalacheck</groupId>
+ <artifactId>scalacheck_${scala.binary.version}</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.mockito</groupId>
+ <artifactId>mockito-core</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <!-- Use mima to perform the compatibility check -->
+ <dependency>
+ <groupId>com.typesafe</groupId>
Review Comment:
The tooling for the MiMa check still lives in `connect-client-jvm`, so this
dependency can be removed.
##########
connector/connect/client/jvm-internal/pom.xml:
##########
@@ -0,0 +1,235 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one or more
+ ~ contributor license agreements. See the NOTICE file distributed with
+ ~ this work for additional information regarding copyright ownership.
+ ~ The ASF licenses this file to You under the Apache License, Version 2.0
+ ~ (the "License"); you may not use this file except in compliance with
+ ~ the License. You may obtain a copy of the License at
+ ~
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing, software
+ ~ distributed under the License is distributed on an "AS IS" BASIS,
+ ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ ~ See the License for the specific language governing permissions and
+ ~ limitations under the License.
+ -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+ <parent>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-parent_2.12</artifactId>
+ <version>4.0.0-SNAPSHOT</version>
+ <relativePath>../../../../pom.xml</relativePath>
+ </parent>
+
+ <artifactId>spark-connect-client-jvm-internal_2.12</artifactId>
+ <packaging>jar</packaging>
+ <name>Spark Project Connect Client Internal</name>
+ <url>https://spark.apache.org/</url>
+ <properties>
+ <sbt.project.name>connect-client-jvm-internal</sbt.project.name>
+ </properties>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sql-api_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sketch_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.google.protobuf</groupId>
+ <artifactId>protobuf-java</artifactId>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ <version>${connect.guava.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>failureaccess</artifactId>
+ <version>${guava.failureaccess.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-codec-http2</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-handler-proxy</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-transport-native-unix-common</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.lihaoyi</groupId>
+ <artifactId>ammonite_${scala.version}</artifactId>
+ <version>${ammonite.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <type>test-jar</type>
+ <scope>test</scope>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.scalacheck</groupId>
+ <artifactId>scalacheck_${scala.binary.version}</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.mockito</groupId>
+ <artifactId>mockito-core</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <!-- Use mima to perform the compatibility check -->
+ <dependency>
+ <groupId>com.typesafe</groupId>
+ <artifactId>mima-core_${scala.binary.version}</artifactId>
+ <version>${mima.version}</version>
+ <scope>test</scope>
+ </dependency>
+ </dependencies>
+ <build>
+ <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
+ <testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
+ <plugins>
+ <!-- Shade all Guava / Protobuf / Netty dependencies of this build -->
+ <!-- TODO (SPARK-42449): Ensure shading rules are handled correctly in `native-image.properties` and support GraalVM -->
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-shade-plugin</artifactId>
+ <configuration>
+ <shadedArtifactAttached>false</shadedArtifactAttached>
+ <artifactSet>
+ <includes>
+ <include>com.google.android:*</include>
+ <include>com.google.api.grpc:*</include>
+ <include>com.google.code.findbugs:*</include>
+ <include>com.google.code.gson:*</include>
+ <include>com.google.errorprone:*</include>
+ <include>com.google.guava:*</include>
+ <include>com.google.j2objc:*</include>
+ <include>com.google.protobuf:*</include>
+ <include>io.grpc:*</include>
+ <include>io.netty:*</include>
+ <include>io.perfmark:*</include>
+ <include>org.codehaus.mojo:*</include>
+ <include>org.checkerframework:*</include>
+ <include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
+ <include>org.apache.spark:spark-common-utils_${scala.binary.version}</include>
+ </includes>
+ </artifactSet>
+ <relocations>
+ <relocation>
+ <pattern>io.grpc</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.grpc</shadedPattern>
+ <includes>
+ <include>io.grpc.**</include>
+ </includes>
+ </relocation>
+ <relocation>
+ <pattern>com.google</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.com.google</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>io.netty</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.netty</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>org.checkerframework</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.org.checkerframework</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>javax.annotation</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.javax.annotation</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>io.perfmark</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.perfmark</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>org.codehaus</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.org.codehaus</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>android.annotation</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.android.annotation</shadedPattern>
+ </relocation>
+ </relocations>
+ <!-- SPARK-42228: Add `ServicesResourceTransformer` to relocation class names in META-INF/services for grpc -->
+ <transformers>
+ <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
+ </transformers>
+ </configuration>
+ </plugin>
+ <plugin>
Review Comment:
The configuration of the `maven-jar-plugin` is a bit superfluous at the
moment, too.
##########
connector/connect/client/jvm-internal/pom.xml:
##########
@@ -0,0 +1,235 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one or more
+ ~ contributor license agreements. See the NOTICE file distributed with
+ ~ this work for additional information regarding copyright ownership.
+ ~ The ASF licenses this file to You under the Apache License, Version 2.0
+ ~ (the "License"); you may not use this file except in compliance with
+ ~ the License. You may obtain a copy of the License at
+ ~
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing, software
+ ~ distributed under the License is distributed on an "AS IS" BASIS,
+ ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ ~ See the License for the specific language governing permissions and
+ ~ limitations under the License.
+ -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+ <parent>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-parent_2.12</artifactId>
+ <version>4.0.0-SNAPSHOT</version>
+ <relativePath>../../../../pom.xml</relativePath>
+ </parent>
+
+ <artifactId>spark-connect-client-jvm-internal_2.12</artifactId>
+ <packaging>jar</packaging>
+ <name>Spark Project Connect Client Internal</name>
+ <url>https://spark.apache.org/</url>
+ <properties>
+ <sbt.project.name>connect-client-jvm-internal</sbt.project.name>
+ </properties>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sql-api_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sketch_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.google.protobuf</groupId>
+ <artifactId>protobuf-java</artifactId>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ <version>${connect.guava.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>failureaccess</artifactId>
+ <version>${guava.failureaccess.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-codec-http2</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-handler-proxy</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-transport-native-unix-common</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.lihaoyi</groupId>
+ <artifactId>ammonite_${scala.version}</artifactId>
+ <version>${ammonite.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <type>test-jar</type>
+ <scope>test</scope>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.scalacheck</groupId>
+ <artifactId>scalacheck_${scala.binary.version}</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.mockito</groupId>
+ <artifactId>mockito-core</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <!-- Use mima to perform the compatibility check -->
+ <dependency>
+ <groupId>com.typesafe</groupId>
+ <artifactId>mima-core_${scala.binary.version}</artifactId>
+ <version>${mima.version}</version>
+ <scope>test</scope>
+ </dependency>
+ </dependencies>
+ <build>
+ <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
+ <testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
+ <plugins>
+ <!-- Shade all Guava / Protobuf / Netty dependencies of this build -->
+ <!-- TODO (SPARK-42449): Ensure shading rules are handled correctly in `native-image.properties` and support GraalVM -->
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-shade-plugin</artifactId>
+ <configuration>
+ <shadedArtifactAttached>false</shadedArtifactAttached>
+ <artifactSet>
+ <includes>
+ <include>com.google.android:*</include>
+ <include>com.google.api.grpc:*</include>
+ <include>com.google.code.findbugs:*</include>
+ <include>com.google.code.gson:*</include>
+ <include>com.google.errorprone:*</include>
+ <include>com.google.guava:*</include>
+ <include>com.google.j2objc:*</include>
+ <include>com.google.protobuf:*</include>
+ <include>io.grpc:*</include>
+ <include>io.netty:*</include>
+ <include>io.perfmark:*</include>
+ <include>org.codehaus.mojo:*</include>
+ <include>org.checkerframework:*</include>
+ <include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
+ <include>org.apache.spark:spark-common-utils_${scala.binary.version}</include>
+ </includes>
+ </artifactSet>
+ <relocations>
+ <relocation>
+ <pattern>io.grpc</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.grpc</shadedPattern>
+ <includes>
+ <include>io.grpc.**</include>
+ </includes>
+ </relocation>
+ <relocation>
+ <pattern>com.google</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.com.google</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>io.netty</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.netty</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>org.checkerframework</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.org.checkerframework</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>javax.annotation</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.javax.annotation</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>io.perfmark</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.perfmark</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>org.codehaus</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.org.codehaus</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>android.annotation</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.android.annotation</shadedPattern>
+ </relocation>
+ </relocations>
+ <!-- SPARK-42228: Add `ServicesResourceTransformer` to relocation class names in META-INF/services for grpc -->
+ <transformers>
+ <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
+ </transformers>
+ </configuration>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-jar-plugin</artifactId>
+ <executions>
+ <execution>
+ <id>prepare-test-jar</id>
+ <phase>test-compile</phase>
+ <goals>
+ <goal>test-jar</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+ <plugin>
+ <groupId>org.codehaus.mojo</groupId>
+ <artifactId>build-helper-maven-plugin</artifactId>
+ <executions>
+ <execution>
+ <id>add-sources</id>
+ <phase>generate-sources</phase>
+ <goals>
+ <goal>add-source</goal>
+ </goals>
+ <configuration>
+ <sources>
+ <source>src/main/scala-${scala.binary.version}</source>
+ </sources>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+</project>
Review Comment:
Needs an empty line at the end of the file :)
##########
connector/connect/client/jvm-internal/pom.xml:
##########
@@ -0,0 +1,235 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one or more
+ ~ contributor license agreements. See the NOTICE file distributed with
+ ~ this work for additional information regarding copyright ownership.
+ ~ The ASF licenses this file to You under the Apache License, Version 2.0
+ ~ (the "License"); you may not use this file except in compliance with
+ ~ the License. You may obtain a copy of the License at
+ ~
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing, software
+ ~ distributed under the License is distributed on an "AS IS" BASIS,
+ ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ ~ See the License for the specific language governing permissions and
+ ~ limitations under the License.
+ -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+ <parent>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-parent_2.12</artifactId>
+ <version>4.0.0-SNAPSHOT</version>
+ <relativePath>../../../../pom.xml</relativePath>
+ </parent>
+
+ <artifactId>spark-connect-client-jvm-internal_2.12</artifactId>
+ <packaging>jar</packaging>
+ <name>Spark Project Connect Client Internal</name>
+ <url>https://spark.apache.org/</url>
+ <properties>
+ <sbt.project.name>connect-client-jvm-internal</sbt.project.name>
+ </properties>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sql-api_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sketch_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.google.protobuf</groupId>
+ <artifactId>protobuf-java</artifactId>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ <version>${connect.guava.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>failureaccess</artifactId>
+ <version>${guava.failureaccess.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-codec-http2</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-handler-proxy</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-transport-native-unix-common</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.lihaoyi</groupId>
+ <artifactId>ammonite_${scala.version}</artifactId>
+ <version>${ammonite.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <type>test-jar</type>
+ <scope>test</scope>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.scalacheck</groupId>
+ <artifactId>scalacheck_${scala.binary.version}</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.mockito</groupId>
+ <artifactId>mockito-core</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <!-- Use mima to perform the compatibility check -->
+ <dependency>
+ <groupId>com.typesafe</groupId>
+ <artifactId>mima-core_${scala.binary.version}</artifactId>
+ <version>${mima.version}</version>
+ <scope>test</scope>
+ </dependency>
+ </dependencies>
+ <build>
+ <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
+ <testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
+ <plugins>
+ <!-- Shade all Guava / Protobuf / Netty dependencies of this build -->
+ <!-- TODO (SPARK-42449): Ensure shading rules are handled correctly in `native-image.properties` and support GraalVM -->
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-shade-plugin</artifactId>
+ <configuration>
+ <shadedArtifactAttached>false</shadedArtifactAttached>
+ <artifactSet>
+ <includes>
+ <include>com.google.android:*</include>
+ <include>com.google.api.grpc:*</include>
+ <include>com.google.code.findbugs:*</include>
+ <include>com.google.code.gson:*</include>
+ <include>com.google.errorprone:*</include>
+ <include>com.google.guava:*</include>
+ <include>com.google.j2objc:*</include>
+ <include>com.google.protobuf:*</include>
+ <include>io.grpc:*</include>
+ <include>io.netty:*</include>
+ <include>io.perfmark:*</include>
+ <include>org.codehaus.mojo:*</include>
+ <include>org.checkerframework:*</include>
+ <include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
+ <include>org.apache.spark:spark-common-utils_${scala.binary.version}</include>
+ </includes>
+ </artifactSet>
+ <relocations>
+ <relocation>
+ <pattern>io.grpc</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.grpc</shadedPattern>
+ <includes>
+ <include>io.grpc.**</include>
+ </includes>
+ </relocation>
+ <relocation>
+ <pattern>com.google</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.com.google</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>io.netty</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.netty</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>org.checkerframework</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.org.checkerframework</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>javax.annotation</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.javax.annotation</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>io.perfmark</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.io.perfmark</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>org.codehaus</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.org.codehaus</shadedPattern>
+ </relocation>
+ <relocation>
+ <pattern>android.annotation</pattern>
+ <shadedPattern>${spark.shade.packageName}.connect.client.android.annotation</shadedPattern>
+ </relocation>
+ </relocations>
+ <!-- SPARK-42228: Add `ServicesResourceTransformer` to relocation class names in META-INF/services for grpc -->
+ <transformers>
+ <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
+ </transformers>
+ </configuration>
+ </plugin>
+ <plugin>
Review Comment:
The configuration of the `maven-jar-plugin` is a bit superfluous at the
moment, too.
##########
project/SparkBuild.scala:
##########
@@ -460,6 +463,7 @@ object SparkBuild extends PomBuild {
enable(SparkConnectCommon.settings)(connectCommon)
enable(SparkConnect.settings)(connect)
enable(SparkConnectClient.settings)(connectClient)
+ enable(SparkConnectClient.settings)(connectClientInternal)
Review Comment:
I would still suggest defining a separate settings object for
`connectClientInternal`, as many of the configurations in `SparkConnectClient`
are likely useless for the `connectClientInternal` module, such as
`buildTestDeps`, the grpc plugin, and so on.
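To illustrate the suggestion, a minimal shape for a dedicated settings object
in `project/SparkBuild.scala` could look like the sketch below. This is purely
illustrative — the object name and its contents are hypothetical, and which
settings the internal module actually needs would have to be worked out from
`SparkConnectClient.settings`:

```scala
// Hypothetical sketch: a dedicated settings object for the internal module,
// carrying only what it needs (no buildTestDeps, no grpc codegen plugin).
object SparkConnectClientInternal {
  lazy val settings = Seq(
    // e.g. the dependency overrides shared with SparkConnectClient would go
    // here; everything client-specific stays out of this object.
  )
}

// in SparkBuild, instead of reusing SparkConnectClient.settings:
// enable(SparkConnectClientInternal.settings)(connectClientInternal)
```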
##########
connector/connect/client/jvm-internal/pom.xml:
##########
@@ -0,0 +1,235 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one or more
+ ~ contributor license agreements. See the NOTICE file distributed with
+ ~ this work for additional information regarding copyright ownership.
+ ~ The ASF licenses this file to You under the Apache License, Version 2.0
+ ~ (the "License"); you may not use this file except in compliance with
+ ~ the License. You may obtain a copy of the License at
+ ~
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing, software
+ ~ distributed under the License is distributed on an "AS IS" BASIS,
+ ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ ~ See the License for the specific language governing permissions and
+ ~ limitations under the License.
+ -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+ <parent>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-parent_2.12</artifactId>
+ <version>4.0.0-SNAPSHOT</version>
+ <relativePath>../../../../pom.xml</relativePath>
+ </parent>
+
+ <artifactId>spark-connect-client-jvm-internal_2.12</artifactId>
+ <packaging>jar</packaging>
+ <name>Spark Project Connect Client Internal</name>
+ <url>https://spark.apache.org/</url>
+ <properties>
+ <sbt.project.name>connect-client-jvm-internal</sbt.project.name>
+ </properties>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sql-api_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sketch_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.google.protobuf</groupId>
+ <artifactId>protobuf-java</artifactId>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ <version>${connect.guava.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>failureaccess</artifactId>
+ <version>${guava.failureaccess.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-codec-http2</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-handler-proxy</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-transport-native-unix-common</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.lihaoyi</groupId>
+ <artifactId>ammonite_${scala.version}</artifactId>
+ <version>${ammonite.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
Review Comment:
Since this module doesn't have any test code yet, all these test-scope
dependencies can be deleted for the time being. They can be added back when
there's a real need.
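
If the test-scope entries are dropped as suggested, the module's
`<dependencies>` section would shrink to roughly the following shape (a
sketch, not an authoritative list; only the compile/provided entries are
kept):

```xml
<dependencies>
  <!-- compile/provided dependencies are kept as-is -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
    <version>${project.version}</version>
  </dependency>
  <!-- ... other non-test entries (sql-api, sketch, protobuf, guava,
       netty, ammonite) unchanged ... -->
  <!-- test-jar / scalacheck / mockito-core / mima-core entries removed
       until the module actually gains test code -->
</dependencies>
```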
##########
connector/connect/client/jvm-internal/pom.xml:
##########
@@ -0,0 +1,235 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+ ~ Licensed to the Apache Software Foundation (ASF) under one or more
+ ~ contributor license agreements. See the NOTICE file distributed with
+ ~ this work for additional information regarding copyright ownership.
+ ~ The ASF licenses this file to You under the Apache License, Version 2.0
+ ~ (the "License"); you may not use this file except in compliance with
+ ~ the License. You may obtain a copy of the License at
+ ~
+ ~ http://www.apache.org/licenses/LICENSE-2.0
+ ~
+ ~ Unless required by applicable law or agreed to in writing, software
+ ~ distributed under the License is distributed on an "AS IS" BASIS,
+ ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ ~ See the License for the specific language governing permissions and
+ ~ limitations under the License.
+ -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+ <parent>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-parent_2.12</artifactId>
+ <version>4.0.0-SNAPSHOT</version>
+ <relativePath>../../../../pom.xml</relativePath>
+ </parent>
+
+ <artifactId>spark-connect-client-jvm-internal_2.12</artifactId>
+ <packaging>jar</packaging>
+ <name>Spark Project Connect Client Internal</name>
+ <url>https://spark.apache.org/</url>
+ <properties>
+ <sbt.project.name>connect-client-jvm-internal</sbt.project.name>
+ </properties>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sql-api_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-sketch_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.google.protobuf</groupId>
+ <artifactId>protobuf-java</artifactId>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ <version>${connect.guava.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>com.google.guava</groupId>
+ <artifactId>failureaccess</artifactId>
+ <version>${guava.failureaccess.version}</version>
+ <scope>compile</scope>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-codec-http2</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-handler-proxy</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>io.netty</groupId>
+ <artifactId>netty-transport-native-unix-common</artifactId>
+ <version>${netty.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.lihaoyi</groupId>
+ <artifactId>ammonite_${scala.version}</artifactId>
+ <version>${ammonite.version}</version>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.apache.spark</groupId>
+ <artifactId>spark-connect-common_${scala.binary.version}</artifactId>
+ <version>${project.version}</version>
+ <type>test-jar</type>
+ <scope>test</scope>
+ <exclusions>
+ <exclusion>
+ <groupId>com.google.guava</groupId>
+ <artifactId>guava</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <dependency>
+ <groupId>org.scalacheck</groupId>
+ <artifactId>scalacheck_${scala.binary.version}</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.mockito</groupId>
+ <artifactId>mockito-core</artifactId>
+ <scope>test</scope>
+ </dependency>
+ <!-- Use mima to perform the compatibility check -->
+ <dependency>
+ <groupId>com.typesafe</groupId>
+ <artifactId>mima-core_${scala.binary.version}</artifactId>
+ <version>${mima.version}</version>
+ <scope>test</scope>
+ </dependency>
+ </dependencies>
+ <build>
+ <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
+ <testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
+ <plugins>
+ <!-- Shade all Guava / Protobuf / Netty dependencies of this build -->
+ <!-- TODO (SPARK-42449): Ensure shading rules are handled correctly in `native-image.properties` and support GraalVM -->
+ <plugin>
Review Comment:
Are all the shading rules necessary? Or is it possible to skip configuring
shading altogether?
I performed a local Maven build and used Maven to run the added test cases.
With the `maven-shade-plugin` retained, both the build of the client module
and the newly added test cases in the server module fail. After removing the
related `maven-shade-plugin` configuration, everything succeeds.
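
For reference, if shading of the uber jar is instead kept in
`connector/connect/client/jvm/pom.xml`, pulling the new module into it would
amount to one more `<include>` entry in the `maven-shade-plugin` artifact
set, roughly like this (a sketch; the surrounding include list is
abbreviated, and the new entry is hypothetical until the module layout is
settled):

```xml
<artifactSet>
  <includes>
    <include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
    <include>org.apache.spark:spark-common-utils_${scala.binary.version}</include>
    <!-- hypothetical new entry for the internal client module -->
    <include>org.apache.spark:spark-connect-client-jvm-internal_${scala.binary.version}</include>
    <!-- ... existing entries unchanged ... -->
  </includes>
</artifactSet>
```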
##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/SparkConnectService.scala:
##########
@@ -263,6 +263,7 @@ object SparkConnectService extends Logging {
private type SessionCacheKey = (String, String)
private[connect] var server: Server = _
+ private[connect] var service: SparkConnectService = _
Review Comment:
hmm, if it's just for testing, this doesn't seem necessary. After removing
this change, the tests still pass.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]