jackye1995 commented on a change in pull request #1640:
URL: https://github.com/apache/iceberg/pull/1640#discussion_r513121949
##########
File path: flink/src/main/java/org/apache/iceberg/flink/CatalogLoader.java
##########
@@ -105,4 +113,54 @@ public String toString() {
.toString();
}
}
+
+ class CustomCatalogLoader implements CatalogLoader {
+
+ private final SerializableConfiguration hadoopConf;
+ private final Map<String, String> properties;
+ private final String name;
+ private final String impl;
+
+ private CustomCatalogLoader(
+ String name,
+ Map<String, String> properties,
+ Configuration conf,
+ String impl) {
+ this.hadoopConf = new SerializableConfiguration(conf);
+    this.properties = new HashMap<>(properties); // use HashMap for serialization
+ this.name = name;
+ this.impl = Preconditions.checkNotNull(impl,
+ "Cannot initialize custom Catalog because impl property is not set");
+ }
+
+ @Override
+ public Catalog loadCatalog() {
+ DynConstructors.Ctor<Catalog> ctor;
+ try {
+ ctor = DynConstructors.builder(Catalog.class)
+        .impl(impl, Map.class, Configuration.class) // take in Flink properties and Hadoop configs
+ .impl(impl) // fall back to no-arg constructor
Review comment:
Yes, I am also thinking about this issue. Another option I am considering is
a constructor that takes only a string map. Because Hadoop `Configuration`
implements the `Iterable<Map.Entry<String, String>>` interface, we can merge
Spark or Flink properties with the Hadoop configuration and pass them to the
constructor as a single map. We could use a wrapper class for the merged map to
ensure property names and config keys do not conflict.
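
A minimal sketch of that idea, assuming the custom catalog exposes a hypothetical single-map constructor; `mergeConfigs` and `loadCatalogFromSingleMap` are illustrative names, not part of this PR:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.iceberg.catalog.Catalog;
import org.apache.iceberg.common.DynConstructors;

public class SingleMapCatalogLoading {

  // Copy Hadoop config entries and engine (Spark/Flink) properties into one
  // string map. Here engine properties win on key conflicts; a wrapper class
  // or key prefix could make conflicts explicit instead.
  static Map<String, String> mergeConfigs(Map<String, String> properties, Configuration conf) {
    Map<String, String> merged = new HashMap<>();
    // Configuration implements Iterable<Map.Entry<String, String>>,
    // so its entries can be copied into a plain string map
    for (Map.Entry<String, String> entry : conf) {
      merged.put(entry.getKey(), entry.getValue());
    }
    merged.putAll(properties);
    return merged;
  }

  static Catalog loadCatalogFromSingleMap(String impl, Map<String, String> properties,
                                          Configuration conf) {
    DynConstructors.Ctor<Catalog> ctor = DynConstructors.builder(Catalog.class)
        .impl(impl, Map.class) // hypothetical single-map constructor
        .impl(impl)            // fall back to no-arg constructor
        .build();
    return ctor.newInstance(mergeConfigs(properties, conf));
  }
}
```

With this shape, the constructor signature no longer depends on Hadoop's `Configuration` class, which keeps custom catalog implementations engine-agnostic.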