yihua commented on code in PR #11163:
URL: https://github.com/apache/hudi/pull/11163#discussion_r1596025879
##########
hudi-common/src/test/java/org/apache/hudi/common/testutils/reader/HoodieFileSliceTestUtils.java:
##########
@@ -247,36 +245,31 @@ public static HoodieBaseFile createBaseFile(
Schema schema,
String baseInstantTime
) throws IOException {
- Configuration hadoopConf = new Configuration();
+ StorageConfiguration<Configuration> conf = HoodieTestUtils.getDefaultStorageConfWithDefaults();
// TODO: Optimize these hard-coded parameters for test purpose. (HUDI-7214)
- BloomFilter filter = BloomFilterFactory.createBloomFilter(
- 1000,
- 0.0001,
- 10000,
- BloomFilterTypeCode.DYNAMIC_V0.name());
- HoodieAvroWriteSupport<IndexedRecord> writeSupport = new HoodieAvroWriteSupport<>(
- new AvroSchemaConverter().convert(schema),
- schema,
- Option.of(filter),
- new Properties());
- HoodieParquetConfig<HoodieAvroWriteSupport> parquetConfig = new HoodieParquetConfig(
- writeSupport,
- CompressionCodecName.GZIP,
- ParquetWriter.DEFAULT_BLOCK_SIZE,
- ParquetWriter.DEFAULT_PAGE_SIZE,
- 1024 * 1024 * 1024,
- hadoopConf,
- 0.1,
- true);
-
- try (HoodieAvroFileWriter writer = (HoodieAvroFileWriter) ReflectionUtils.loadClass(HOODIE_AVRO_PARQUET_WRITER,
- new Class<?>[] {StoragePath.class, HoodieParquetConfig.class, String.class, TaskContextSupplier.class, boolean.class},
- new StoragePath(baseFilePath),
- parquetConfig,
- baseInstantTime,
- new LocalTaskContextSupplier(),
- true)) {
+ HoodieConfig cfg = new HoodieConfig();
Review Comment:
Should this test class be moved to the `hudi-hadoop-common` module?
`HoodieAvroWriteSupport<IndexedRecord>` still needs to be tested.
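As context for the TODO (HUDI-7214) about the hard-coded Bloom filter parameters being removed here: the `1000` / `0.0001` arguments are the expected entry count and target false-positive rate. A minimal plain-JDK sketch of the standard Bloom filter sizing math those two parameters imply (this is textbook sizing, not Hudi's actual `BloomFilterFactory` internals; the third argument, `10000`, is the growth cap used by the `DYNAMIC_V0` variant and is not modeled below):

```java
public class BloomSizing {
  // Optimal bit-array size: m = -n * ln(p) / (ln 2)^2
  static long optimalBits(long numEntries, double falsePositiveRate) {
    return (long) Math.ceil(
        -numEntries * Math.log(falsePositiveRate) / (Math.log(2) * Math.log(2)));
  }

  // Optimal hash-function count: k = (m / n) * ln 2, at least 1
  static int optimalHashes(long bits, long numEntries) {
    return Math.max(1, (int) Math.round((double) bits / numEntries * Math.log(2)));
  }

  public static void main(String[] args) {
    long n = 1000;      // numEntries from the removed test code
    double p = 0.0001;  // errorRate from the removed test code
    long m = optimalBits(n, p);
    int k = optimalHashes(m, n);
    System.out.println("bits=" + m + " hashes=" + k); // prints bits=19171 hashes=13
  }
}
```

So roughly 19 bits per key and 13 hashes for these settings, which is why tuning (or centralizing) these constants for test purposes is worthwhile.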
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]