TeRS-K commented on a change in pull request #2740:
URL: https://github.com/apache/hudi/pull/2740#discussion_r609825554



##########
File path: hudi-cli/src/main/scala/org/apache/hudi/cli/SparkHelpers.scala
##########
@@ -40,7 +40,7 @@ import scala.collection.mutable._
 object SparkHelpers {
   @throws[Exception]
   def skipKeysAndWriteNewFile(instantTime: String, fs: FileSystem, sourceFile: 
Path, destinationFile: Path, keysToSkip: Set[String]) {
-    val sourceRecords = ParquetUtils.readAvroRecords(fs.getConf, sourceFile)
+    val sourceRecords = new ParquetUtils().readAvroRecords(fs.getConf, sourceFile)

Review comment:
       Yes, that makes sense. How about the `ParquetUtils` instances in `HoodieSparkBootstrapSchemaProvider::getBootstrapSourceSchema` and `HoodieParquetReader`? Should I replace those with `DataFileUtils.getInstance` to obtain the instance as well? A rough sketch of the two call styles under discussion follows below.
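       For concreteness, here is a minimal sketch of the two styles being compared. The direct-instantiation call is what this PR's diff introduces; the factory-style lookup via `DataFileUtils.getInstance` is the proposal raised in this thread, and its exact package, argument, and return type are assumptions here, not confirmed by the PR:

```scala
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hudi.common.util.ParquetUtils

object ReaderLookupSketch {
  def readSourceRecords(fs: FileSystem, sourceFile: Path) = {
    // Style used by this PR's change: instantiate ParquetUtils directly at the call site.
    val viaDirectInstance = new ParquetUtils().readAvroRecords(fs.getConf, sourceFile)

    // Style asked about in this comment: resolve the reader through a factory so the
    // Parquet-specific implementation is not hard-coded at each call site.
    // The signature below is an assumption for illustration only:
    // val fileUtils = DataFileUtils.getInstance(HoodieFileFormat.PARQUET)
    // val viaFactory = fileUtils.readAvroRecords(fs.getConf, sourceFile)

    viaDirectInstance
  }
}
```

       The trade-off being discussed is that the factory lookup keeps callers such as `HoodieSparkBootstrapSchemaProvider` and `HoodieParquetReader` decoupled from the concrete Parquet reader, whereas direct instantiation is simpler but repeats the format choice at every call site.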




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

