voonhous commented on code in PR #14311:
URL: https://github.com/apache/hudi/pull/14311#discussion_r2579668351
##########
hudi-utilities/src/main/java/org/apache/hudi/utilities/HoodieMetadataTableValidator.java:
##########
@@ -863,13 +856,9 @@ && compareTimestamps(
}
/**
- * Compare the file listing and index data between metadata table and fileSystem.
- * For now, validate five kinds of apis:
- * 1. HoodieTableFileSystemView::getLatestFileSlices
- * 2. HoodieTableFileSystemView::getLatestBaseFiles
- * 3. HoodieTableFileSystemView::getAllFileGroups and HoodieTableFileSystemView::getAllFileSlices
- * 4. HoodieTableFileSystemView::getColumnStats
- * 5. HoodieTableFileSystemView::getBloomFilters
+ * Compare the file listing and index data between metadata table and fileSystem. For now, validate five kinds of apis: 1. HoodieTableFileSystemView::getLatestFileSlices 2.
Review Comment:
Will revert all formatting changes manually.
##########
hudi-utilities/src/main/java/org/apache/hudi/utilities/HoodieMetadataTableValidator.java:
##########
@@ -1009,14 +998,12 @@ private static List<FileSlice> filterFileSliceBasedOnInflightCleaning(List<FileS
private List<HoodieBaseFile> filterBaseFileBasedOnInflightCleaning(List<HoodieBaseFile> sortedBaseFileList,
Set<String> baseDataFilesForCleaning) {
return sortedBaseFileList.stream()
- .filter(baseFile -> {
- return !baseDataFilesForCleaning.contains(baseFile.getFileName());
- }).collect(Collectors.toList());
+ .filter(baseFile -> !baseDataFilesForCleaning.contains(baseFile.getFileName())).collect(Collectors.toList());
}
@SuppressWarnings("rawtypes")
private void validateAllColumnStats(HoodieMetadataValidationContext metadataTableBasedContext, HoodieMetadataValidationContext fsBasedContext,
- String partitionPath, Set<String> baseDataFilesForCleaning) throws Exception {
+ String partitionPath, Set<String> baseDataFilesForCleaning) throws Exception {
Review Comment:
Will revert all formatting changes manually.
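For readers skimming the diff above: it replaces a braced lambda body (`{ return …; }`) with a single-expression lambda inside `Stream.filter`. A minimal standalone sketch of the same idiom, with hypothetical names (plain `String` file names rather than Hudi's `HoodieBaseFile`):

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public class FilterSketch {
  // Hypothetical stand-in for filterBaseFileBasedOnInflightCleaning:
  // drop every file whose name is already scheduled for cleaning.
  static List<String> filterPendingCleaning(List<String> sortedFileNames,
                                            Set<String> baseDataFilesForCleaning) {
    return sortedFileNames.stream()
        // Expression lambda: no braces and no explicit return, same behavior
        // as the braced form the PR removes.
        .filter(name -> !baseDataFilesForCleaning.contains(name))
        .collect(Collectors.toList());
  }
}
```

The behavior is unchanged; the expression form simply reads more directly when the body is a single boolean test.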
##########
hudi-utilities/src/main/java/org/apache/hudi/utilities/HoodieMetadataTableValidator.java:
##########
@@ -1857,8 +1839,8 @@ public HoodieMetadataValidationContext(
}
private HoodieTableFileSystemView getFileSystemView(HoodieEngineContext context,
- HoodieTableMetaClient metaClient, HoodieMetadataConfig metadataConfig,
- FileSystemViewStorageConfig viewConf, HoodieCommonConfig commonConfig) {
+ HoodieTableMetaClient metaClient, HoodieMetadataConfig metadataConfig,
Review Comment:
Will revert all formatting changes manually.
##########
hudi-spark-datasource/hudi-spark/src/test/java/org/apache/hudi/client/functional/TestMetadataUtilRLIandSIRecordGeneration.java:
##########
@@ -605,22 +596,22 @@ public void testReducedByKeysForRLIRecords() throws IOException {
fail("Should not have reached here");
} catch (Exception e) {
// expected. no-op
- assertTrue(e.getCause() instanceof HoodieIOException);
+ assertInstanceOf(HoodieIOException.class, e.getCause());
Review Comment:
Will revert all formatting changes manually.
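The change above swaps `assertTrue(e.getCause() instanceof HoodieIOException)` for JUnit 5's `assertInstanceOf`, whose failure message reports the actual type instead of a bare "expected true". A hand-rolled equivalent is sketched below so it runs without JUnit on the classpath; the real API is `org.junit.jupiter.api.Assertions.assertInstanceOf`:

```java
public class AssertSketch {
  // Minimal re-implementation of JUnit 5's assertInstanceOf, for illustration
  // only. On failure it names the actual type; on success it returns the
  // value already cast to the expected type.
  static <T> T assertInstanceOf(Class<T> expectedType, Object actual) {
    if (!expectedType.isInstance(actual)) {
      throw new AssertionError("expected instance of " + expectedType.getName()
          + " but was " + (actual == null ? "null" : actual.getClass().getName()));
    }
    return expectedType.cast(actual);
  }
}
```

The returned cast value is the other advantage over `assertTrue(x instanceof T)`: follow-up assertions on `T`-specific fields need no separate cast.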
##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/ColumnStatsIndexSupport.scala:
##########
@@ -57,7 +57,7 @@ import scala.collection.parallel.mutable.ParHashMap
class ColumnStatsIndexSupport(spark: SparkSession,
tableSchema: StructType,
- avroSchema: Schema,
+ hoodieSchema: HoodieSchema,
Review Comment:
Done
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]