ferenc-csaky commented on code in PR #23063:
URL: https://github.com/apache/flink/pull/23063#discussion_r1287523132


##########
flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/catalog/FileCatalogStore.java:
##########
@@ -97,25 +114,29 @@ public void storeCatalog(String catalogName, CatalogDescriptor catalog)
             throws CatalogException {
         checkOpenState();
 
-        Path filePath = getCatalogPath(catalogName);
+        Path catalogPath = getCatalogPath(catalogName);
         try {
-            File file = filePath.toFile();
-            if (file.exists()) {
+            FileSystem fs = catalogPath.getFileSystem();
+
+            if (fs.exists(catalogPath)) {
                 throw new CatalogException(
                         String.format(
                                 "Catalog %s's store file %s is already exist.",
-                                catalogName, filePath));
+                                catalogName, catalogPath));
+            }
+
+            try (FSDataOutputStream os = fs.create(catalogPath, WriteMode.NO_OVERWRITE)) {
+                YAML_MAPPER.writeValue(os, catalog.getConfiguration().toMap());
             }
-            // create a new file
-            file.createNewFile();
-            String yamlString = yaml.dumpAsMap(catalog.getConfiguration().toMap());
-            FileUtils.writeFile(file, yamlString, charset);
-            LOG.info("Catalog {}'s configuration saved to file {}", catalogName, filePath);
-        } catch (Throwable e) {
+
+            LOG.info("Catalog {}'s configuration saved to file {}", catalogName, catalogPath);
+        } catch (CatalogException e) {
+            throw e;
+        } catch (Exception e) {

Review Comment:
   I'm not sure what you mean by "refine". For example, splitting the operations that can throw an unexpected checked exception into multiple try/catch blocks just to avoid catching `CatalogException` explicitly would only complicate the code further, I think. IMO this is not an anti-pattern at all; there are already multiple examples of it in the codebase, e.g. [here](https://github.com/apache/flink/blob/dfb9cb851dc1f0908ea6c3ce1230dd8ca2b48733/flink-filesystems/flink-hadoop-fs/src/main/java/org/apache/flink/runtime/fs/hdfs/HadoopFsFactory.java#L193).
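   For reference, the catch/rethrow pattern under discussion can be sketched as follows. This is a minimal, self-contained illustration, not the actual Flink code: `CatalogException` is stubbed as a plain `RuntimeException` subclass and the class/method names are hypothetical. The point is that the specific domain exception is rethrown unchanged, while every other failure is wrapped exactly once:

```java
public class RethrowPattern {

    // Stand-in for Flink's CatalogException (hypothetical stub for illustration).
    static class CatalogException extends RuntimeException {
        CatalogException(String message) { super(message); }
        CatalogException(String message, Throwable cause) { super(message, cause); }
    }

    static void storeCatalog(boolean alreadyExists) {
        try {
            if (alreadyExists) {
                // Already carries a precise message; wrapping it again would
                // only bury that message one cause-level deeper.
                throw new CatalogException("store file already exists");
            }
            // Simulate an unexpected checked exception from the write path.
            throw new java.io.IOException("disk full");
        } catch (CatalogException e) {
            throw e; // rethrow the domain exception unchanged
        } catch (Exception e) {
            // Wrap everything else into the domain exception exactly once.
            throw new CatalogException("failed to store catalog", e);
        }
    }

    public static void main(String[] args) {
        try {
            storeCatalog(true);
        } catch (CatalogException e) {
            System.out.println(e.getMessage()); // prints "store file already exists"
        }
        try {
            storeCatalog(false);
        } catch (CatalogException e) {
            // prints "failed to store catalog / disk full"
            System.out.println(e.getMessage() + " / " + e.getCause().getMessage());
        }
    }
}
```

   Without the first catch clause, the `CatalogException` thrown inside the `try` would be caught by `catch (Exception e)` and needlessly double-wrapped, hiding its specific message behind a generic one.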



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
