rdblue commented on a change in pull request #1618:
URL: https://github.com/apache/iceberg/pull/1618#discussion_r518185671



##########
File path: flink/src/main/java/org/apache/iceberg/flink/CatalogLoader.java
##########
@@ -45,12 +45,22 @@
    */
   Catalog loadCatalog();
 
-  static CatalogLoader hadoop(String name, Configuration hadoopConf, String warehouseLocation) {
-    return new HadoopCatalogLoader(name, hadoopConf, warehouseLocation);
+  static CatalogLoader hadoop(
+      String name,
+      Configuration hadoopConf,
+      String warehouseLocation,
+      Map<String, String> properties) {
+    return new HadoopCatalogLoader(name, hadoopConf, warehouseLocation, properties);

Review comment:
       For Flink, I think we should pass the warehouse location and other config through properties. Right now, we pull it out in `FlinkCatalogFactory`, but it doesn't make much sense to pull out only some properties.
   
   How about leaving the current `CatalogLoader.hadoop` and `CatalogLoader.hive` as they are and adding variants like `CatalogLoader.hadoop(String name, Map<String, String> properties, Configuration conf)`? Then we can pull the properties out using standard config properties that we put in `CatalogProperties` in the loader classes.
   
   FYI @JingsongLi and @openinx: we're improving how we configure catalogs and allowing some parts to be loaded dynamically. The main change here is passing properties to the catalog as a map.
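
   To illustrate the shape being proposed, here is a minimal, self-contained sketch of a properties-driven loader variant. This is not the Iceberg implementation: `SketchCatalogLoader`, the `Configuration` stand-in class, and the `"warehouse"` key (standing in for a constant in `CatalogProperties`) are all hypothetical names used only to show the loader resolving standard properties itself instead of the factory pulling them out.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: the loader receives the full properties map and
// resolves well-known keys internally, rather than having the caller
// (e.g. FlinkCatalogFactory) extract only some of them up front.
public class SketchCatalogLoader {
  // Stand-in for a standard key in CatalogProperties (assumed name).
  static final String WAREHOUSE_LOCATION = "warehouse";

  // Stand-in for org.apache.hadoop.conf.Configuration, to keep the
  // sketch dependency-free.
  static class Configuration {}

  private final String name;
  private final String warehouseLocation;
  private final Map<String, String> properties;

  private SketchCatalogLoader(String name, Map<String, String> properties) {
    this.name = name;
    // The loader, not the factory, pulls standard config out of the map.
    this.warehouseLocation = properties.get(WAREHOUSE_LOCATION);
    this.properties = properties;
  }

  // Proposed-style variant: name + full properties map + Hadoop conf.
  static SketchCatalogLoader hadoop(String name, Map<String, String> properties, Configuration conf) {
    return new SketchCatalogLoader(name, properties);
  }

  String warehouseLocation() {
    return warehouseLocation;
  }

  public static void main(String[] args) {
    Map<String, String> props = new HashMap<>();
    props.put(WAREHOUSE_LOCATION, "hdfs://nn:8020/warehouse");
    SketchCatalogLoader loader = SketchCatalogLoader.hadoop("my_catalog", props, new Configuration());
    System.out.println(loader.warehouseLocation());
  }
}
```

   The existing `hadoop(name, conf, warehouseLocation)` entry point would remain untouched; callers that already hold a properties map would use the new variant instead.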




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
