singhravidutt commented on code in PR #23489:
URL: https://github.com/apache/flink/pull/23489#discussion_r1428174024


##########
flink-filesystems/flink-gs-fs-hadoop/pom.xml:
##########
@@ -188,6 +212,29 @@ under the License.
                                                        <goal>shade</goal>
                                                </goals>
                                                <configuration>
+                                                       <filters>
+                                                               <filter>
+                                                                       <artifact>org.apache.flink:flink-fs-hadoop-shaded</artifact>
+                                                                       <excludes>

Review Comment:
   ```
   [WARNING] flink-fs-hadoop-shaded-1.19-SNAPSHOT.jar, guava-32.1.2-jre.jar define 1837 overlapping classes: 
   [WARNING]   - com.google.common.annotations.Beta
   [WARNING]   - com.google.common.annotations.GwtCompatible
   [WARNING]   - com.google.common.annotations.GwtIncompatible
   [WARNING]   - com.google.common.annotations.VisibleForTesting
   [WARNING]   - com.google.common.base.Absent
   [WARNING]   - com.google.common.base.AbstractIterator
   [WARNING]   - com.google.common.base.AbstractIterator$1
   [WARNING]   - com.google.common.base.AbstractIterator$State
   [WARNING]   - com.google.common.base.Ascii
   [WARNING]   - com.google.common.base.CaseFormat
   [WARNING]   - 1827 more...
   ```
   I see this while building the package. My interpretation of it is that `flink-fs-hadoop-shaded` is a shaded jar AND it does not relocate the guava classes, so the shaded jar itself contains the guava classes. Hence just excluding guava as a transitive dependency of the `flink-fs-hadoop-shaded` module is not enough.
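   For illustration, what is needed instead is the kind of class-level filter this diff starts to add. The actual `<excludes>` entries are cut off in the hunk above, so the package pattern below is only my assumption of how it could look:
   ```xml
   <!-- sketch only: in the maven-shade-plugin configuration of flink-gs-fs-hadoop,
        filter out the guava classes that are baked into the flink-fs-hadoop-shaded jar
        (the package pattern below is assumed, not taken from this PR) -->
   <filters>
           <filter>
                   <artifact>org.apache.flink:flink-fs-hadoop-shaded</artifact>
                   <excludes>
                           <exclude>com/google/common/**</exclude>
                   </excludes>
           </filter>
   </filters>
   ```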
   
   `flink-gs-fs-hadoop` will contain two implementations of the guava classes (i.e. `com.google.common.*`): one coming from `flink-fs-hadoop-shaded`, which bundles guava `v27.1`, and the other from guava `v32.1.2`. As the function `buildOrThrow` is not available in `v27.1`, this causes a runtime failure.
   
   Hence we have to either repackage every dependency of `flink-fs-hadoop-shaded` and then add it as a dependency, or exclude the jars manually.
   
   What are your thoughts on that?


