xicm commented on issue #11772:
URL: https://github.com/apache/hudi/issues/11772#issuecomment-2290497979

   A table created through the DataFrame API initializes `hoodie.table.precombine.field`. When we use `insert into` in Spark SQL without setting `hoodie.table.precombine.field` or `hoodie.datasource.write.precombine.field`, `buildHoodieInsertConfig` still sets the precombine field to "" when it is not configured.
   
   During validation, the precombine field in the table properties holds its real value while the value in the params is "", so the check throws an exception.
   
   So we should either remove that code in ProvidesHoodieConfig or make the default value null instead of an empty string.
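   To illustrate the mismatch, here is a minimal sketch (not actual Hudi code; class and method names are hypothetical) of how treating "" as "not set" during validation would avoid the false exception:

   ```java
   import java.util.Objects;

   public class PrecombineCheck {
       // Mirrors the proposed fix: normalize an empty string to null ("not set").
       static String normalize(String precombine) {
           return (precombine == null || precombine.isEmpty()) ? null : precombine;
       }

       // Validation passes when the write param is unset or matches the table property.
       static boolean isConsistent(String tableProp, String writeParam) {
           String param = normalize(writeParam);
           return param == null || Objects.equals(tableProp, param);
       }

       public static void main(String[] args) {
           // Table created via the DataFrame API stored a real precombine field...
           String tableProp = "ts";
           // ...but the insert config defaulted the unset param to "".
           System.out.println(isConsistent(tableProp, ""));   // true after normalizing
           System.out.println(isConsistent(tableProp, "ts")); // true
           System.out.println(isConsistent(tableProp, "id")); // false
       }
   }
   ```

   With the current behavior, the "" default is compared literally against the real table property and fails; normalizing it (or defaulting to null) makes the comparison a no-op when the user never set a precombine field.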

