kbendick commented on issue #2558:
URL: https://github.com/apache/iceberg/issues/2558#issuecomment-908597980


   Thanks for working on this. I wanted to draw attention to the issue mentioned above (Update FlinkCatalogFactory to implement the new Factory interface), which I closed as essentially a duplicate of this one.
   
   The reason I opened that issue is that several users have reported difficulties using Iceberg with Flink, specifically around configuring Hadoop.
   
   In some environments, users might not have access to set up the classpath properly to include Hadoop. The new [Context interface that provides access to the ClassLoader](https://github.com/apache/flink/commit/4497e96b96724aee6637f19f64ffc8ba47a9b0ac#diff-d4dda8460ab068327897de5487517f43186faced18d3a121981e70f6ad9f827aR85) might help with that.
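   For reference, here is a minimal sketch of how a `Factory`-based catalog factory could use the ClassLoader supplied by the `Context` to check for Hadoop lazily instead of requiring it on its own classpath. This assumes the Flink 1.13 `CatalogFactory.Context#getClassLoader()` API added in the commit linked above; the class and method names below are hypothetical, not Iceberg's actual `FlinkCatalogFactory`:
   
   ```java
   import org.apache.flink.table.factories.CatalogFactory;
   
   // Sketch only: hypothetical helper, not Iceberg's actual FlinkCatalogFactory code.
   public final class HadoopAvailabilityCheck {
   
     private HadoopAvailabilityCheck() {}
   
     // Returns true if Hadoop's Configuration class is visible to the user
     // ClassLoader supplied by the new CatalogFactory.Context.
     public static boolean hadoopIsAvailable(CatalogFactory.Context context) {
       try {
         Class.forName(
             "org.apache.hadoop.conf.Configuration",
             false,                      // don't initialize the class yet
             context.getClassLoader());  // ClassLoader provided by the Context
         return true;
       } catch (ClassNotFoundException e) {
         // Hadoop isn't on the classpath, e.g. when the catalog only needs S3FileIO.
         return false;
       }
     }
   }
   ```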
   
   The ultimate goal is to remove the requirement for Hadoop to be in the environment when it isn't actually needed (for example, when using `S3FileIO`). The thought was that upgrading to Flink 1.13 (and implementing the new `Factory` interface in `FlinkCatalogFactory`) would help by providing easier access to the ClassLoader.
   
   In environments where the Hadoop `Configuration` isn't used, such as Ververica Platform or AWS Kinesis Data Analytics with `S3FileIO`, the hope is to remove the need for the Hadoop `Configuration` entirely.
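   To make that concrete, the kind of catalog definition this would enable from the Flink side is roughly the following. It is only a sketch of the intended end state, not something that works today: the catalog name, warehouse path, and the choice of `GlueCatalog` are placeholders, with `S3FileIO` standing in for any non-Hadoop `FileIO`:
   
   ```java
   import org.apache.flink.table.api.EnvironmentSettings;
   import org.apache.flink.table.api.TableEnvironment;
   
   // Sketch only: an Iceberg catalog backed by S3FileIO, so that no Hadoop
   // Configuration (core-site.xml / hdfs-site.xml) should be needed.
   public class HadoopFreeCatalogSketch {
     public static void main(String[] args) {
       TableEnvironment tEnv =
           TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());
   
       // Catalog name and warehouse location are placeholders.
       tEnv.executeSql(
           "CREATE CATALOG my_catalog WITH (\n"
               + "  'type' = 'iceberg',\n"
               + "  'catalog-impl' = 'org.apache.iceberg.aws.glue.GlueCatalog',\n"
               + "  'io-impl' = 'org.apache.iceberg.aws.s3.S3FileIO',\n"
               + "  'warehouse' = 's3://my-bucket/warehouse'\n"
               + ")");
     }
   }
   ```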
   
   For Ververica Platform, users have reported a workaround, but on Kinesis Data Analytics it's a bit more difficult (as far as has been reported).
   
   More information on the Hadoop `Configurable` concern can be found in this 
issue: https://github.com/apache/iceberg/issues/3044
   
   Thanks again @zhangjun0x01!

