leeseven1211 opened a new issue, #12989:
URL: https://github.com/apache/hudi/issues/12989
When using bulk insert, only `ConsistentBucketBulkInsertDataInternalWriterHelper` and `BulkInsertDataInternalWriterHelper` are selected. Why is `BucketBulkInsertDataInternalWriterHelper` not supported for the SIMPLE bucket index engine?
Below is the code snippet:

```scala
val writer = writeConfig.getIndexType match {
  case HoodieIndex.IndexType.BUCKET
      if writeConfig.getBucketIndexEngineType == BucketIndexEngineType.CONSISTENT_HASHING =>
    new ConsistentBucketBulkInsertDataInternalWriterHelper(
      table,
      writeConfig,
      instantTime,
      taskPartitionId,
      taskId,
      taskEpochId,
      schema,
      writeConfig.populateMetaFields,
      arePartitionRecordsSorted,
      shouldPreserveHoodieMetadata)
  // Is it possible to add support here?
  case _ =>
    new BulkInsertDataInternalWriterHelper(
      table,
      writeConfig,
      instantTime,
      taskPartitionId,
      taskId,
      taskEpochId,
      schema,
      writeConfig.populateMetaFields,
      arePartitionRecordsSorted,
      shouldPreserveHoodieMetadata)
}
```
**Expected behavior**

Add a case for the SIMPLE bucket index engine:

```scala
case HoodieIndex.IndexType.BUCKET
    if writeConfig.getBucketIndexEngineType == BucketIndexEngineType.SIMPLE =>
  new BucketBulkInsertDataInternalWriterHelper(
    xxx
  )
```
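To make the requested change concrete, here is a minimal, self-contained sketch of the three-way dispatch being proposed. The enum values, the `chooseWriter` method, and the class `WriterDispatchSketch` are all stand-ins for illustration only; they are not the real Hudi API, and the real writer helpers take the full constructor argument list shown in the snippet above.

```java
// Hypothetical, simplified model of the bulk-insert writer dispatch.
// All names here are stand-ins, not actual Hudi classes.
public class WriterDispatchSketch {

    enum IndexType { BUCKET, OTHER }

    enum BucketIndexEngineType { CONSISTENT_HASHING, SIMPLE }

    // Returns the name of the writer helper that would be constructed
    // for the given index type and bucket index engine.
    static String chooseWriter(IndexType index, BucketIndexEngineType engine) {
        if (index == IndexType.BUCKET && engine == BucketIndexEngineType.CONSISTENT_HASHING) {
            return "ConsistentBucketBulkInsertDataInternalWriterHelper";
        }
        if (index == IndexType.BUCKET && engine == BucketIndexEngineType.SIMPLE) {
            // Proposed new branch: dispatch to the simple-bucket helper
            // instead of falling through to the generic default.
            return "BucketBulkInsertDataInternalWriterHelper";
        }
        return "BulkInsertDataInternalWriterHelper";
    }
}
```

The point of the sketch is that today the SIMPLE bucket engine falls through to the generic `case _` branch; adding one more guarded case would route it to a bucket-aware helper.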
**Environment Description**
* Hudi version : 0.14.0
* Spark version : 3.1.1
* Hive version : 3.1.1
* Hadoop version : 3.1.1
* Storage (HDFS/S3/GCS..) : hdfs
* Running on Docker? (yes/no) : no