eric9204 opened a new issue, #6742: URL: https://github.com/apache/hudi/issues/6742
**Describe the problem you faced**

A Flink SQL streaming insert into a MERGE_ON_READ Hudi table fails with `java.lang.ClassNotFoundException: org.apache.hudi.org.apache.avro.LogicalTypes$LocalTimestampMillis` (full stacktrace below).

**To Reproduce**

Steps to reproduce the behavior:

1. Run the following SQL statements:

```sql
CREATE TABLE datagen (
  col_1 int,
  col_2 TIMESTAMP,
  col_4 STRING,
  col_5 STRING,
  col_6 STRING,
  col_7 STRING,
  col_8 STRING,
  col_9 STRING,
  col_10 STRING
) WITH (
  'connector' = 'datagen',
  'rows-per-second'='50'
);

create table random_hudi (
  col_1 int PRIMARY KEY NOT ENFORCED,
  col_2 TIMESTAMP,
  col_3 STRING,
  col_4 STRING,
  col_5 STRING,
  col_6 STRING,
  col_7 STRING,
  col_8 STRING,
  col_9 STRING,
  col_10 STRING
) PARTITIONED BY (`col_3`) WITH (
  'connector'='hudi',
  'path'='/tmp/hudi/random_hudi',
  'table.type'='MERGE_ON_READ',
  'changelog.enabled'='true',
  'write.precombine'='true',
  'write.precombine.field'='ts',
  'write.operation'='upsert',
  'write.bucket_assign.tasks'='1',
  'write.tasks'='1',
  'hoodie.index.type'='BLOOM',
  'index.global.enabled'='false',
  'compaction.async.enabled'='false',
  'compaction.delta_commits'='1',
  'compaction.schedule.enabled'='true',
  'metadata.enabled'='false'
);

insert into random_hudi
select
  col_1,
  col_2,
  DATE_FORMAT(localtimestamp, 'yyyyMMddHH') as col_3,
  col_4,
  col_5,
  col_6,
  col_7,
  col_8,
  col_9,
  col_10
from datagen;
```

2. The Flink `lib/` directory contents, plus a `grep` for the shaded Avro class:
```
[root@host-10 lib]# ll
total 369028
-rw-r--r-- 1 root root     85586 Jun 10 14:30 flink-csv-1.14.5.jar
-rw-r--r-- 1 root root 136098285 Jun 10 14:34 flink-dist_2.12-1.14.5.jar
-rw-r--r-- 1 root root    153142 Jun 10 14:30 flink-json-1.14.5.jar
-rw-r--r-- 1 root root   7709731 Jun  9 15:33 flink-shaded-zookeeper-3.4.14.jar
-rw-r--r-- 1 root root  39666418 Jun 10 14:32 flink-table_2.12-1.14.5.jar
-rw-r--r-- 1 root root   2747878 Sep 22 16:20 guava-27.0-jre.jar
-rw-r--r-- 1 root root  94494236 Sep  1 13:44 hudi-flink1.14-bundle-0.12.0.jar.bak
-rw-r--r-- 1 root root  94465555 Sep 22 14:12 hudi-flink1.14-bundle-0.13.0-SNAPSHOT.jar
-rw-r--r-- 1 root root    115534 Jul 22 11:27 javax.ws.rs-api-2.0.1.jar
-rw-r--r-- 1 root root    208006 Jun  9 15:17 log4j-1.2-api-2.17.1.jar
-rw-r--r-- 1 root root    301872 Jun  9 15:17 log4j-api-2.17.1.jar
-rw-r--r-- 1 root root   1790452 Jun  9 15:17 log4j-core-2.17.1.jar
-rw-r--r-- 1 root root     24279 Jun  9 15:17 log4j-slf4j-impl-2.17.1.jar
[root@host-10 lib]# grep org.apache.hudi.org.apache.avro.LogicalTypes *.jar
Binary file hudi-flink1.14-bundle-0.13.0-SNAPSHOT.jar matches
```

**Expected behavior**

The `insert into random_hudi` streaming job starts and runs without the `ClassNotFoundException` shown in the stacktrace.

**Environment Description**

* Hudi version : Hudi master (2022.9.22 snapshot)
* Spark version : none
* Hive version : none
* Hadoop version : 3.3.0
* Storage (HDFS/S3/GCS..) : HDFS
* Running on Docker? (yes/no) : no

**Additional context**

Flink 1.14.5

**Stacktrace**

```
[ERROR] Could not execute SQL statement. Reason:
java.lang.ClassNotFoundException: org.apache.hudi.org.apache.avro.LogicalTypes$LocalTimestampMillis
```
