punish-yh commented on issue #9587: URL: https://github.com/apache/hudi/issues/9587#issuecomment-1702123005
> You are right, because you only have one primary key field: `eid`, maybe you should set up the spark key generator as simple.

Thank you for your reply. I set `hoodie.table.keygenerator.class=org.apache.hudi.keygen.SimpleAvroKeyGenerator` and ran the job again. The bulk_insert job finished successfully, but in upsert mode the records were written to the `__HIVE_DEFAULT_PARTITION__` partition. I had configured the `_db` and `_table` fields as partition fields, but the simple key generator does not split the partition field, so the field lookup in `getPartitionPath` finds no match and falls back to `__HIVE_DEFAULT_PARTITION__`.

For now I can work around the problem with a custom key generator, but I would like to ask: does this behavior align with the simple key generator's original design? If not, maybe I can fix it so the simple key generator supports multiple partition keys.
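As a possible alternative to a custom key generator, a sketch of the writer configuration using Hudi's complex key generator, which does split a comma-separated partition-field list (shown here with the single record key `eid` and the two partition fields from this issue; the exact option set for your job is an assumption):

```properties
# Record key: a single field is fine with the complex key generator too
hoodie.datasource.write.recordkey.field=eid

# Multiple partition fields, comma-separated
hoodie.datasource.write.partitionpath.field=_db,_table

# ComplexAvroKeyGenerator handles multi-field record keys and partition paths
# (use org.apache.hudi.keygen.ComplexKeyGenerator for the Spark Row-based path)
hoodie.table.keygenerator.class=org.apache.hudi.keygen.ComplexAvroKeyGenerator
```

This avoids the mismatch described above, since the complex key generator splits `_db,_table` into separate fields before building the partition path, whereas the simple key generator treats the whole string as one field name and therefore cannot resolve it.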
