xccui opened a new issue, #5870: URL: https://github.com/apache/hudi/issues/5870
**Describe the problem you faced**

Hello! We use Flink to write some Postgres CDC data to a Hudi table. The table is partitioned by a custom date format (as shown below).

```
setString(FlinkOptions.PARTITION_PATH_FIELD, "timestamp_field")
setString(FlinkOptions.KEYGEN_TYPE, KeyGeneratorType.TIMESTAMP.toString())
setString(KeyGeneratorOptions.Config.TIMESTAMP_TYPE_FIELD_PROP, "EPOCHMILLISECONDS")
setString(KeyGeneratorOptions.Config.TIMESTAMP_OUTPUT_DATE_FORMAT_PROP, "yyyy")
setString(KeyGeneratorOptions.Config.TIMESTAMP_TIMEZONE_FORMAT_PROP, "UTC+0:00")
```

When querying the table with Flink SQL, we got a `DateTimeParseException`. After some tests, we found that the table can be queried successfully with the extra option `'hoodie.datasource.write.partitionpath.field' = ''`.

**Environment Description**

* Hudi version : 0.12.0-SNAPSHOT on master
* Flink version : 1.14
* Storage (HDFS/S3/GCS..) : S3
* Running on Docker? (yes/no) : no

**Additional context**

Not sure if it's relevant, but the default values for the option `hoodie.datasource.write.partitionpath.field` differ between `FlinkOptions.java` and `KeyGeneratorOptions.java`.
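For reference, one way to attach the workaround option to a single query is Flink's dynamic table options hint. This is a sketch, not verified against this exact setup, and it assumes `table.dynamic-table-options.enabled` is set to `true` in the session; the table name `hudi_table` is a placeholder:

```sql
-- Workaround reported above: blank out the partition-path field on read
-- so the reader does not try to restore the "yyyy" partition value as a timestamp.
SELECT *
FROM hudi_table /*+ OPTIONS('hoodie.datasource.write.partitionpath.field' = '') */;
```

Alternatively, the same key/value can be put in the `WITH (...)` clause of the table DDL instead of a per-query hint.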
**Stacktrace**

```
Caused by: java.time.format.DateTimeParseException: Text '2022' could not be parsed at index 4
    at java.time.format.DateTimeFormatter.parseResolved0(DateTimeFormatter.java:1949)
    at java.time.format.DateTimeFormatter.parse(DateTimeFormatter.java:1851)
    at java.time.LocalDateTime.parse(LocalDateTime.java:492)
    at java.time.LocalDateTime.parse(LocalDateTime.java:477)
    at org.apache.flink.table.filesystem.RowPartitionComputer.restorePartValueFromType(RowPartitionComputer.java:122)
    at org.apache.flink.table.filesystem.RowPartitionComputer.restorePartValueFromType(RowPartitionComputer.java:84)
    at org.apache.hudi.table.format.mor.MergeOnReadInputFormat.lambda$getReader$0(MergeOnReadInputFormat.java:302)
    at java.util.LinkedHashMap.forEach(LinkedHashMap.java:684)
    at org.apache.hudi.table.format.mor.MergeOnReadInputFormat.getReader(MergeOnReadInputFormat.java:302)
    at org.apache.hudi.table.format.mor.MergeOnReadInputFormat.getFullSchemaReader(MergeOnReadInputFormat.java:288)
    at org.apache.hudi.table.format.mor.MergeOnReadInputFormat.open(MergeOnReadInputFormat.java:205)
    at org.apache.hudi.table.format.mor.MergeOnReadInputFormat.open(MergeOnReadInputFormat.java:81)
    at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:84)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66)
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:269)
```
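The failure in the stacktrace can be reproduced in isolation: `RowPartitionComputer.restorePartValueFromType` goes through `LocalDateTime.parse`, and a year-only partition value such as `2022` (produced by `TIMESTAMP_OUTPUT_DATE_FORMAT_PROP = "yyyy"`) is not a full ISO timestamp. A minimal sketch (class and helper names are illustrative, not Flink's actual code):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeParseException;

public class YearPartitionRepro {

    // Mimics restoring a TIMESTAMP-typed partition column from its
    // partition-path string, as Flink's RowPartitionComputer does.
    static String restore(String partValue) {
        try {
            return LocalDateTime.parse(partValue).toString();
        } catch (DateTimeParseException e) {
            return "FAILED: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // Year-only partition path from the "yyyy" output date format:
        // fails at index 4, where the parser expects '-' after the year.
        System.out.println(restore("2022"));

        // A full ISO-8601 timestamp restores fine.
        System.out.println(restore("2022-06-14T22:40:53"));
    }
}
```

This matches the reported symptom: the reader treats the partition-path value as a timestamp column to restore, but the custom output format only wrote the year.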
