punish-yh commented on issue #9587:
URL: https://github.com/apache/hudi/issues/9587#issuecomment-1702123005

   > You are right, because you only have one primary key field: `eid`, maybe 
you should set up the spark key generator as simple.
   
   Thank you for your reply. I set `hoodie.table.keygenerator.class=org.apache.hudi.keygen.SimpleAvroKeyGenerator` and ran the job again.
   
   The bulk_insert job finished successfully, but in upsert mode the records were written to the `__HIVE_DEFAULT_PARTITION__` partition. I configured the `_db` and `_table` fields as partition fields, but the simple key generator does not split the partition field list, so the field lookup in `getPartitionPath` fails to match and it returns `__HIVE_DEFAULT_PARTITION__`.
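   The mismatch described above can be illustrated with a small sketch (illustrative Python only, not Hudi's actual implementation): a simple key generator treats the configured partition path as a single field name, so a comma-joined value like `_db,_table` matches no field and the default partition is returned.

   ```python
   # Illustrative sketch -- NOT Hudi's actual code.
   HIVE_DEFAULT_PARTITION = "__HIVE_DEFAULT_PARTITION__"

   def simple_partition_path(record: dict, partition_field: str) -> str:
       # A simple key generator looks up ONE field name verbatim and does not
       # split on commas, so "_db,_table" matches nothing in the record.
       value = record.get(partition_field)
       return str(value) if value not in (None, "") else HIVE_DEFAULT_PARTITION

   record = {"eid": 1, "_db": "demo", "_table": "orders"}
   print(simple_partition_path(record, "_db"))          # single field resolves
   print(simple_partition_path(record, "_db,_table"))   # falls back to default
   ```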
   
   
![image](https://github.com/apache/hudi/assets/59658062/af0dfaff-3cc6-4758-b315-c3aaedfe0b14)
   
![image](https://github.com/apache/hudi/assets/59658062/fdc6590d-6c56-4d08-9a44-6725e3b48742)
   
![image](https://github.com/apache/hudi/assets/59658062/998a196e-6f81-4b82-aef8-0c440b7af297)
   
   For now, I can use a custom key generator to work around the problem.
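   Besides a custom key generator, Hudi's built-in `ComplexKeyGenerator` is the generator intended for multi-field partition paths. A sketch of the Spark datasource write options (the table name and the commented write call are hypothetical placeholders):

   ```python
   # Sketch of Hudi Spark datasource options for multi-field partitioning;
   # "my_table" and base_path are hypothetical placeholders.
   hudi_options = {
       "hoodie.table.name": "my_table",
       "hoodie.datasource.write.recordkey.field": "eid",
       "hoodie.datasource.write.partitionpath.field": "_db,_table",
       # ComplexKeyGenerator splits the comma-separated partition field list.
       "hoodie.datasource.write.keygenerator.class":
           "org.apache.hudi.keygen.ComplexKeyGenerator",
       "hoodie.datasource.write.operation": "upsert",
   }
   # df.write.format("hudi").options(**hudi_options).mode("append").save(base_path)
   ```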
   
   
   But I would like to ask: does this behavior align with the simple key generator's original design? Maybe I can fix it so that the simple key generator supports multiple partition keys.

