This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new 96517aade98 [SPARK-42276][BUILD][CONNECT] Add 
`ServicesResourceTransformer` rule to connect server module shade configuration
96517aade98 is described below

commit 96517aade9843a9b44c01b078cd114ffe202b576
Author: yangjie01 <[email protected]>
AuthorDate: Fri Feb 10 11:58:19 2023 +0900

    [SPARK-42276][BUILD][CONNECT] Add `ServicesResourceTransformer` rule to 
connect server module shade configuration
    
    ### What changes were proposed in this pull request?
    This PR adds the `ServicesResourceTransformer` rule to the connect server
module's shade configuration to make sure `grpc.ManagedChannelProvider` and
`grpc.ServerProvider` can be used on the server side.
    
    ### Why are the changes needed?
    To keep `grpc.ManagedChannelProvider`, `grpc.ServerProvider`, and other SPI
lookups usable after grpc is shaded. The sbt build does not need a fix because
`sbt-assembly` merges service descriptor files by default.
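    For context, the providers involved are discovered through
`java.util.ServiceLoader`, which reads `META-INF/services/<interface>`
descriptor files from the class path; `ServicesResourceTransformer` is the
shade-plugin rule that merges and relocates those files. A minimal,
self-contained sketch of the lookup mechanism (the `Greeter` SPI here is
hypothetical, for illustration only):

    ```scala
    import java.net.URLClassLoader
    import java.nio.charset.StandardCharsets
    import java.nio.file.Files
    import java.util.ServiceLoader

    // Hypothetical SPI used only for this demo.
    trait Greeter { def greet(): String }
    class EnglishGreeter extends Greeter { override def greet(): String = "hello" }

    object ServiceLoaderDemo {
      def run(): List[String] = {
        // Write a META-INF/services descriptor: the file name is the interface,
        // each line inside is an implementation class name.
        val dir = Files.createTempDirectory("spi-demo")
        val svcDir = dir.resolve("META-INF/services")
        Files.createDirectories(svcDir)
        Files.write(
          svcDir.resolve(classOf[Greeter].getName),
          classOf[EnglishGreeter].getName.getBytes(StandardCharsets.UTF_8))

        // ServiceLoader scans META-INF/services on the given class loader;
        // grpc's ManagedChannelProvider/ServerProvider lookups work the same way.
        val loader = new URLClassLoader(Array(dir.toUri.toURL), getClass.getClassLoader)
        val it = ServiceLoader.load(classOf[Greeter], loader).iterator()
        var found = List.empty[String]
        while (it.hasNext) found = found :+ it.next().greet()
        found
      }

      def main(args: Array[String]): Unit = println(run())
    }
    ```

    If shading relocates the implementation classes but not these descriptor
files, the lookup comes back empty, which is the failure mode this PR fixes.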
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    - Pass GitHub Actions
    - Manual test: build a Spark Connect jar and run as follows:
    
    ```
    bin/spark-shell --jars spark-connect_2.12-3.5.0-SNAPSHOT.jar 
--driver-class-path spark-connect_2.12-3.5.0-SNAPSHOT.jar --conf 
spark.plugins=org.apache.spark.sql.connect.SparkConnectPlugin --conf 
spark.connect.grpc.binding.port=15102
    ```
    
    then run some code in spark-shell
    
    Before
    
    ```scala
    23/02/01 20:44:58 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).
    Spark context Web UI available at http://localhost:4040
    Spark context available as 'sc' (master = local[*], app id = 
local-1675255501816).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 3.5.0-SNAPSHOT
          /_/
    
    Using Scala version 2.12.17 (OpenJDK 64-Bit Server VM, Java 11.0.17)
    Type in expressions to have them evaluated.
    Type :help for more information.
    
    scala> org.sparkproject.connect.grpc.ServerProvider.provider
    
org.sparkproject.connect.grpc.ManagedChannelProvider$ProviderNotFoundException: 
No functional server found. Try adding a dependency on the grpc-netty or 
grpc-netty-shaded artifact
      at 
org.sparkproject.connect.grpc.ServerProvider.provider(ServerProvider.java:44)
      ... 47 elided
    
    scala> org.sparkproject.connect.grpc.ManagedChannelProvider.provider
    
org.sparkproject.connect.grpc.ManagedChannelProvider$ProviderNotFoundException: 
No functional channel service provider found. Try adding a dependency on the 
grpc-okhttp, grpc-netty, or grpc-netty-shaded artifact
      at 
org.sparkproject.connect.grpc.ManagedChannelProvider.provider(ManagedChannelProvider.java:45)
      ... 47 elided
    
    ```
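    The failure above happens because class relocation rewrites bytecode
references but, without `ServicesResourceTransformer`, leaves the
`META-INF/services` descriptors with their original names. Roughly
(illustrative file contents, not taken from the actual jar):

    ```
    # Without the transformer: the descriptor keeps its original io.grpc names,
    # so the relocated ServiceLoader lookup finds nothing.
    META-INF/services/io.grpc.ManagedChannelProvider
        io.grpc.netty.NettyChannelProvider

    # With the transformer: both the file name and its contents are relocated.
    META-INF/services/org.sparkproject.connect.grpc.ManagedChannelProvider
        org.sparkproject.connect.grpc.netty.NettyChannelProvider
    ```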
    
    After
    
    ```scala
    23/02/01 21:00:13 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).
    Spark context Web UI available at http://localhost:4040
    Spark context available as 'sc' (master = local[*], app id = 
local-1675256417224).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 3.5.0-SNAPSHOT
          /_/
    
    Using Scala version 2.12.17 (OpenJDK 64-Bit Server VM, Java 11.0.17)
    Type in expressions to have them evaluated.
    Type :help for more information.
    
    scala> org.sparkproject.connect.grpc.ManagedChannelProvider.provider
    res0: org.sparkproject.connect.grpc.ManagedChannelProvider = 
org.sparkproject.connect.grpc.netty.NettyChannelProvider68aa505b
    
    scala> org.sparkproject.connect.grpc.ServerProvider.provider
    res2: org.sparkproject.connect.grpc.ServerProvider = 
org.sparkproject.connect.grpc.netty.NettyServerProvider4a5d8ae4
    
    ```
    
    Closes #39848 from LuciferYang/SPARK-42276.
    
    Lead-authored-by: yangjie01 <[email protected]>
    Co-authored-by: YangJie <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
    (cherry picked from commit af50b47e12040f86c4f81ff84407ad820cb252c1)
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 connector/connect/server/pom.xml | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/connector/connect/server/pom.xml b/connector/connect/server/pom.xml
index 4c21d700989..ab7d83652dc 100644
--- a/connector/connect/server/pom.xml
+++ b/connector/connect/server/pom.xml
@@ -349,6 +349,9 @@
               
<shadedPattern>${spark.shade.packageName}.connect.google_protos.type</shadedPattern>
             </relocation>
           </relocations>
+          <transformers>
+            <transformer 
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
+          </transformers>
         </configuration>
       </plugin>
     </plugins>


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
