zhangjun0x01 commented on a change in pull request #1558:
URL: https://github.com/apache/iceberg/pull/1558#discussion_r502111390



##########
File path: flink/src/main/java/org/apache/iceberg/flink/FlinkCatalogFactory.java
##########
@@ -67,14 +79,14 @@
    * @return an Iceberg catalog loader
    */
   protected CatalogLoader createCatalogLoader(String name, Map<String, String> options) {
-    String catalogType = options.getOrDefault(ICEBERG_CATALOG_TYPE, "hive");
+    String catalogType = options.getOrDefault(ICEBERG_CATALOG_TYPE, ICEBERG_CATALOG_TYPE_HIVE);
     switch (catalogType) {
-      case "hive":
+      case ICEBERG_CATALOG_TYPE_HIVE:

Review comment:
       I think kbendick's suggestion is right: we should extract some constants, but that is unrelated to loading hive-site.xml, so I will remove them from this PR. If necessary, we can open another PR for the constants.

##########
File path: flink/src/test/java/org/apache/iceberg/flink/FlinkCatalogTestBase.java
##########
@@ -100,6 +100,8 @@ public FlinkCatalogTestBase(String catalogName, String[] baseNamespace) {
     config.put("type", "iceberg");
     config.put(FlinkCatalogFactory.ICEBERG_CATALOG_TYPE, isHadoopCatalog ? "hadoop" : "hive");
     config.put(FlinkCatalogFactory.HADOOP_WAREHOUSE_LOCATION, "file:" + warehouse);
+    String path = this.getClass().getClassLoader().getResource("hive-site.xml").getPath();
+    config.put(FlinkCatalogFactory.HIVE_SITE_PATH, path);

Review comment:
       I followed Flink's practice here: if no scheme is specified, for example /tmp/abc, the path is converted to a local file path according to the operating system.



