wgtmac commented on code in PR #1026:
URL: https://github.com/apache/parquet-mr/pull/1026#discussion_r1105191438


##########
parquet-hadoop/src/main/java/org/apache/parquet/hadoop/rewrite/ParquetRewriter.java:
##########
@@ -183,12 +186,61 @@ public ParquetRewriter(TransParquetFileReader reader,
     }
   }
 
+  // Open all input files to validate their schemas are compatible to merge
+  private void openInputFiles(List<Path> inputFiles, Configuration conf) {
+    Preconditions.checkArgument(inputFiles != null && !inputFiles.isEmpty(), "No input files");
+
+    for (Path inputFile : inputFiles) {
+      try {
+        TransParquetFileReader reader = new TransParquetFileReader(
+                HadoopInputFile.fromPath(inputFile, conf), HadoopReadOptions.builder(conf).build());
+        MessageType inputFileSchema = reader.getFooter().getFileMetaData().getSchema();
+        if (this.schema == null) {
+          this.schema = inputFileSchema;
+        } else {
+          // Now we enforce equality of schemas from input files for simplicity.
+          if (!this.schema.equals(inputFileSchema)) {
+            throw new InvalidSchemaException("Input files have different schemas");
+          }
+        }
+        this.allOriginalCreatedBys.add(reader.getFooter().getFileMetaData().getCreatedBy());
+        this.inputFiles.add(reader);
+      } catch (IOException e) {
+        throw new IllegalArgumentException("Failed to open input file: " + inputFile, e);
+      }
+    }
+
+    extraMetaData.put(ORIGINAL_CREATED_BY_KEY, String.join("\n", allOriginalCreatedBys));

Review Comment:
   > Do we do dedup?
   
   Yes, `allOriginalCreatedBys` is a `HashSet`, which handles the deduplication.
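   
   For illustration, here is a minimal standalone sketch (not the PR code itself) of the dedup-and-join behavior: duplicate created_by strings collapse in the `HashSet` before being joined with `"\n"` into the value stored under `ORIGINAL_CREATED_BY_KEY`. The sample created_by values below are made up.
   
   ```java
   import java.util.Arrays;
   import java.util.HashSet;
   import java.util.Set;
   
   public class CreatedByDedupSketch {
     public static void main(String[] args) {
       // Simulates collecting created_by strings from several input files.
       Set<String> allOriginalCreatedBys = new HashSet<>();
       for (String createdBy : Arrays.asList(
           "parquet-mr version 1.12.3",
           "parquet-mr version 1.12.3",   // duplicate, silently dropped by the HashSet
           "parquet-mr version 1.13.0")) {
         allOriginalCreatedBys.add(createdBy);
       }
   
       // Joined value that would go into the extra metadata:
       // only the two distinct created_by strings remain.
       System.out.println(String.join("\n", allOriginalCreatedBys));
     }
   }
   ```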


