rdblue commented on a change in pull request #1640:
URL: https://github.com/apache/iceberg/pull/1640#discussion_r513691007
##########
File path: flink/src/main/java/org/apache/iceberg/flink/CatalogLoader.java
##########
@@ -105,4 +113,54 @@ public String toString() {
           .toString();
     }
   }
+
+  class CustomCatalogLoader implements CatalogLoader {
+
+    private final SerializableConfiguration hadoopConf;
+    private final Map<String, String> properties;
+    private final String name;
+    private final String impl;
+
+    private CustomCatalogLoader(
+        String name,
+        Map<String, String> properties,
+        Configuration conf,
+        String impl) {
+      this.hadoopConf = new SerializableConfiguration(conf);
+      this.properties = new HashMap<>(properties); // use hashmap for serialization
+      this.name = name;
+      this.impl = Preconditions.checkNotNull(impl,
+          "Cannot initialize custom Catalog because impl property is not set");
+    }
+
+    @Override
+    public Catalog loadCatalog() {
+      DynConstructors.Ctor<Catalog> ctor;
+      try {
+        ctor = DynConstructors.builder(Catalog.class)
+            .impl(impl, Map.class, Configuration.class) // take in flink properties and hadoop configs
+            .impl(impl) // fall back to no-arg constructor
Review comment:
I don't think that we want to use `CatalogLoader`. The use case for that
is a bit different: it is for Hive, where `Configuration` is the _correct_ way
to pass options. That loader should call whatever dynamic loader function we
introduce in this PR. I also think that it shouldn't be located in the
iceberg-hive-metastore module. That should be in iceberg-mr.
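
For context, a shared dynamic loader of the kind described above might look roughly like the sketch below. This is only an illustration: the `DynamicCatalogs` class name and `loadCatalog` signature are hypothetical, and the sketch assumes the catalog implementation has a no-arg constructor, optionally implements Hadoop's `Configurable` to receive the `Configuration`, and accepts its remaining options through an `initialize(name, properties)` hook; none of this is taken from the PR's final API.

```java
import java.util.Map;

import org.apache.hadoop.conf.Configurable;
import org.apache.hadoop.conf.Configuration;
import org.apache.iceberg.catalog.Catalog;
import org.apache.iceberg.common.DynConstructors;

// Hypothetical shared helper; class and method names are illustrative only.
public class DynamicCatalogs {

  private DynamicCatalogs() {
  }

  public static Catalog loadCatalog(String impl, String name,
                                    Map<String, String> properties, Configuration conf) {
    // Instantiate the configured implementation through its no-arg constructor;
    // build() fails fast at runtime if no such constructor exists.
    DynConstructors.Ctor<Catalog> ctor = DynConstructors.builder(Catalog.class)
        .impl(impl)
        .build();
    Catalog catalog = ctor.newInstance();

    // Only catalogs that opt in via Hadoop's Configurable receive the Configuration,
    // keeping Configuration out of the generic loading contract.
    if (catalog instanceof Configurable) {
      ((Configurable) catalog).setConf(conf);
    }

    // Everything else is handed over as a plain properties map (assumed hook).
    catalog.initialize(name, properties);
    return catalog;
  }
}
```

With a helper along these lines, the Flink `CustomCatalogLoader` above and the corresponding Hive-side loader would each reduce to a single delegating call rather than duplicating the `DynConstructors` lookup.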