n3nash commented on a change in pull request #1355: [HUDI-633] limit archive file block size by number of bytes
URL: https://github.com/apache/incubator-hudi/pull/1355#discussion_r384112616
##########
File path: hudi-client/src/main/java/org/apache/hudi/io/HoodieCommitArchiveLog.java
##########
@@ -245,11 +249,23 @@ public void archive(List<HoodieInstant> instants) throws HoodieCommitException {
Schema wrapperSchema = HoodieArchivedMetaEntry.getClassSchema();
LOG.info("Wrapper schema " + wrapperSchema.toString());
List<IndexedRecord> records = new ArrayList<>();
+ long totalInMemSize = 0;
for (HoodieInstant hoodieInstant : instants) {
try {
- records.add(convertToAvroRecord(commitTimeline, hoodieInstant));
- if (records.size() >= this.config.getCommitArchivalBatchSize()) {
+ IndexedRecord record = convertToAvroRecord(commitTimeline, hoodieInstant);
+ totalInMemSize += this.sizeEstimator.sizeEstimate(record);
Review comment:
Why do we need to call `sizeEstimate` for every record? Can we call it for the first X records and use that average instead? Size estimation of an object is CPU-intensive and time-consuming.
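
The averaging approach the reviewer suggests could be sketched as a wrapper around a size estimator: estimate the first X records exactly, then reuse the running average for the rest. This is a minimal, hypothetical illustration with a stand-in `SizeEstimator` interface and an assumed sample size of 2; it is not Hudi's actual implementation or API.

```java
public class SamplingEstimatorDemo {

  // Stand-in for a size-estimator interface (hypothetical, not Hudi's class).
  interface SizeEstimator<T> {
    long sizeEstimate(T record);
  }

  // Delegates to the expensive estimator for the first `sampleSize` records,
  // then returns the running average of those samples for all later records.
  static class SamplingSizeEstimator<T> implements SizeEstimator<T> {
    private final SizeEstimator<T> delegate;
    private final int sampleSize;
    private long sampledTotal = 0;
    private int sampledCount = 0;

    SamplingSizeEstimator(SizeEstimator<T> delegate, int sampleSize) {
      this.delegate = delegate;
      this.sampleSize = sampleSize;
    }

    @Override
    public long sizeEstimate(T record) {
      if (sampledCount < sampleSize) {
        long size = delegate.sizeEstimate(record); // expensive call, sampled only
        sampledTotal += size;
        sampledCount++;
        return size;
      }
      // Cheap path: average of the sampled sizes (integer division).
      return sampledTotal / sampledCount;
    }
  }

  public static void main(String[] args) {
    // Toy "exact" estimator: string length stands in for object size.
    SizeEstimator<String> exact = s -> s.length();
    SamplingSizeEstimator<String> est = new SamplingSizeEstimator<>(exact, 2);
    System.out.println(est.sizeEstimate("aaaa"));     // sampled exactly: 4
    System.out.println(est.sizeEstimate("aa"));       // sampled exactly: 2
    System.out.println(est.sizeEstimate("zzzzzzzz")); // averaged: (4+2)/2 = 3
  }
}
```

The trade-off, as the comment implies, is accuracy versus CPU: records much larger than the sampled ones would be under-counted toward the archive block's byte limit, so the sample size (or periodic re-sampling) would need to be tuned.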
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services