hudi-bot opened a new issue, #17357: URL: https://github.com/apache/hudi/issues/17357
[https://github.com/apache/hudi/pull/12577] A test repro creates a table with precombine key column `ts bigint` and inserts one record; it always complains that the `ts` column value cannot be cast from string to bigint, even though the insert explicitly casts the value as bigint. The cause is that Spark enforces partition columns to be the last columns in the table schema. Needs a follow-up doc update.

```scala
spark.sql(
  s"""
     |create table $tableName (
     |  id bigint,
     |  name string,
     |  price double,
     |  ts bigint,
     |  dt string
     |) using hudi
     |tblproperties (
     |  type = 'mor',
     |  primaryKey = 'id',
     |  precombineKey = 'ts'
     |)
     |partitioned by (dt)
     |location '${tmp.getCanonicalPath}'
     |""".stripMargin)
```

## JIRA info

- Link: https://issues.apache.org/jira/browse/HUDI-8827
- Type: Sub-task
- Parent: https://issues.apache.org/jira/browse/HUDI-9109
- Fix version(s): 1.1.0
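The partition-column reordering can be illustrated with a minimal sketch. The table name `t`, its columns, and the inserted values below are hypothetical, not taken from the report; the sketch only shows how declaring a partition column mid-schema can misalign a positional insert once Spark moves that column to the end:

```sql
-- Hypothetical illustration: partition column dt declared in the middle.
create table t (
  id bigint,
  dt string,   -- partition column declared second
  ts bigint
) using hudi
tblproperties (primaryKey = 'id', precombineKey = 'ts')
partitioned by (dt);

-- Spark stores partition columns last, so the effective schema
-- is (id, ts, dt). A positional insert written against the
-- declared order then lines the string up with ts and may fail
-- with a cast error similar to the one reported:
insert into t values (1, '2024-01-01', cast(1000 as bigint));
```

In the repro above `dt` is already declared last, yet the error still appears, which is why the issue asks for a follow-up on how the precombine/partition column ordering is documented.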
