jiangzzwy opened a new issue, #9474:
URL: https://github.com/apache/hudi/issues/9474
### Environment
- Flink: 1.17.1
- Hudi: 0.14.0-rc1
- Hadoop: 3.2.2
### init.sql script
```sql
SET 'state.checkpoints.dir' = 'hdfs:///hudi/checkpoints/';
SET 'execution.checkpointing.interval' = '20s';
SET 'execution.checkpointing.min-pause' = '5s';
SET 'execution.checkpointing.max-concurrent-checkpoints' = '1';

ADD JAR '/export/server/flink-1.17.1/hudi-flink1.17-bundle-0.14.0-rc1.jar';

CREATE TABLE t_hudi_user (
  id BIGINT,
  name STRING,
  age INT,
  sex BOOLEAN,
  city STRING,
  birth TIMESTAMP(3)
)
PARTITIONED BY (birth)
WITH (
  'connector' = 'hudi',
  'hoodie.datasource.write.recordkey.field' = 'id',
  'path' = 'hdfs://CentOS:9000/hudi/t_hudi_user',
  'table.type' = 'MERGE_ON_READ',
  'compaction.trigger.strategy' = 'num_or_time',
  'compaction.delta_commits' = '3',
  'compaction.delta_seconds' = '300',
  'hoodie.datasource.write.hive_style_partitioning' = 'true',
  'write.datetime.partitioning' = 'true',
  'write.partition.format' = 'yyyy-MM-dd',
  'hive_sync.assume_date_partitioning' = 'true',
  'hive_sync.mode' = 'hms',
  'write.precombine.field' = 'birth',
  'changelog.enabled' = 'true',
  'read.streaming.enabled' = 'true',
  'read.streaming.check-interval' = '3',
  'compaction.tasks' = '2',
  'hive_sync.enable' = 'true',
  'hive_sync.table' = 't_hudi_user',
  'hive_sync.db' = 'default',
  'hive_sync.metastore.uris' = 'thrift://192.168.42.129:9083',
  'hoodie.datasource.hive_sync.support_timestamp' = 'true'
);
```
When I run a query, the console terminal raises `java.lang.ClassNotFoundException: org.apache.hudi.table.format.mor.MergeOnReadInputSplit`. I'm sure the `MergeOnReadInputSplit` class is already compiled into the `hudi-flink1.17-bundle-0.14.0-rc1.jar` file.
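One quick way to double-check that claim is to list the bundle's contents, since a jar is just a zip archive. This is only a sketch: the `has_class` helper name is my own, and the jar path is taken from the `ADD JAR` statement above.

```shell
# Helper: check whether a class file is present inside a jar (jars are zip archives).
has_class() {
  # $1 = path to the jar, $2 = class file path inside the jar
  unzip -l "$1" | grep -q "$2"
}

# Against the bundle referenced in the SQL above (path as configured there):
# has_class /export/server/flink-1.17.1/hudi-flink1.17-bundle-0.14.0-rc1.jar \
#           'org/apache/hudi/table/format/mor/MergeOnReadInputSplit.class' && echo present
```

If the class is present in the jar but the query still fails, the jar may simply not be on the classpath of the process that throws the exception.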


But inserting works while querying does not, which seems very strange to me!

I also tried a lower Flink version (1.14.x) and it does not have this problem.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]