ngk2009 commented on issue #4297:
URL: https://github.com/apache/hudi/issues/4297#issuecomment-993232636


   > > > Flink uses its own plugins to support filesystems other than HDFS. Hudi adapts to different DFSs by extending the `FileSystem` interface directly.
   > >
   > > How can this be solved? Thanks.
   > 
   > Hudi depends on the Hadoop `FileSystem` interface. What we need to do is add the AWS S3 `FileSystem` implementation classes to the classpath, and its specific configuration should also be set in the Hadoop `Configuration`. You can look at `StreamerUtil.getHadoopConf` to see how we fetch the Hadoop configuration in the Flink pipeline.
   
   Hi, I found that you changed the class loader in `org.apache.hudi.sink.StreamWriteOperatorCoordinator` with `Thread.currentThread().setContextClassLoader(getClass().getClassLoader())`. Does that mean only the classpath of the hudi-bundle jar will be used? Will the classes under flink/lib then fail to load, causing the s3 scheme not to be found? So which "aws s3 `FileSystem` impl" package should be added to flink/lib?
   The jars currently in flink/lib are:
   hadoop-aws-3.0.0-cdh6.3.0.jar
   hadoop-common-3.0.0-cdh6.3.0.jar
   hadoop-hdfs-client-3.0.0-cdh6.3.0.jar
   hadoop-mapreduce-client-core-3.0.0-cdh6.3.0.jar
   flink-s3-fs-hadoop-1.13.3.jar
   aws-java-sdk-s3-1.11.836.jar
   hudi-aws-0.10.0.jar
   hudi-flink-bundle_2.11-0.10.0.jar
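   For reference (not from the thread above, and values are placeholders), this is a minimal sketch of the standard Hadoop S3A properties that the Hadoop `Configuration` picked up by Hudi would typically need, e.g. via a core-site.xml on the classpath:

   ```xml
   <!-- core-site.xml — minimal S3A sketch; replace placeholder values -->
   <configuration>
     <property>
       <name>fs.s3a.impl</name>
       <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
     </property>
     <property>
       <name>fs.s3a.access.key</name>
       <value>YOUR_ACCESS_KEY</value>
     </property>
     <property>
       <name>fs.s3a.secret.key</name>
       <value>YOUR_SECRET_KEY</value>
     </property>
     <property>
       <name>fs.s3a.endpoint</name>
       <value>s3.amazonaws.com</value>
     </property>
   </configuration>
   ```

   The `org.apache.hudi.org.apache.hadoop.fs.s3a.S3AFileSystem` class named in `fs.s3a.impl` is provided by hadoop-aws, which must be resolvable by whichever class loader ends up handling the `s3a://` scheme.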
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

