danny0405 commented on code in PR #12627:
URL: https://github.com/apache/hudi/pull/12627#discussion_r1914009026
##########
hudi-client/hudi-spark-client/src/main/java/org/apache/hudi/table/action/commit/SparkBucketIndexPartitioner.java:
##########
@@ -127,10 +131,13 @@ public BucketInfo getBucketInfo(int bucketNumber) {
     if (fileIdOption.isPresent()) {
       return new BucketInfo(BucketType.UPDATE, fileIdOption.get(),
           partitionPath);
     } else {
-      // Always write into log file instead of base file if using NB-CC
-      BucketType bucketType = isNonBlockingConcurrencyControl ?
-          BucketType.UPDATE : BucketType.INSERT;
       String fileIdPrefix = BucketIdentifier.newBucketFileIdPrefix(bucketId,
           isNonBlockingConcurrencyControl);
-      return new BucketInfo(bucketType, fileIdPrefix, partitionPath);
+      // Always write into log file instead of base file if using NB-CC
+      if (isNonBlockingConcurrencyControl) {
+        String fileId = FSUtils.createNewFileId(fileIdPrefix, 0);
Review Comment:
Can you confirm whether the Flink writer also uses 0 as the file ID suffix?
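
For illustration, a minimal sketch of the file-ID convention under discussion, assuming (as the diff suggests) that `FSUtils.createNewFileId` joins a bucket file-ID prefix and a numeric index with a hyphen; the class and prefix value here are hypothetical stand-ins, not Hudi's actual implementation:

```java
// Sketch of the file-ID suffix convention being reviewed.
// Assumption: createNewFileId(prefix, n) produces "<prefix>-<n>",
// so the Spark bucket-index writer would emit IDs ending in "-0".
// If the Flink writer follows the same convention, its file IDs
// for the first slice should carry the same "-0" suffix.
public class FileIdSuffixSketch {

  // Hypothetical stand-in for FSUtils.createNewFileId(fileIdPrefix, 0)
  static String createNewFileId(String fileIdPrefix, int index) {
    return fileIdPrefix + "-" + index;
  }

  public static void main(String[] args) {
    // A bucket-index prefix encodes the bucket number in its first
    // segment; this UUID-shaped value is illustrative only.
    String prefix = "00000001-0000-0000-0000-000000000000";
    String fileId = createNewFileId(prefix, 0);
    System.out.println(fileId);
  }
}
```

If the Flink path builds its file IDs with a different (or no) numeric suffix, the Spark and Flink writers would target different file groups for the same bucket, which is presumably what this review comment is asking to rule out.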
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]