Anandonzy opened a new issue, #8844:
URL: https://github.com/apache/hudi/issues/8844
**When I select from the Hive `_rt` table, I get this error**
- I use Flink 1.16 to create a Hudi 0.13 MOR table, with `hms` mode to sync the Hive table.
- select * from hudi_hive_test.t1_20230530_type_mor_sink_rt limit 10;
**Describe the problem you faced**
org.apache.hive.service.cli.HiveSQLException: java.io.IOException:
org.apache.hudi.org.apache.avro.AvroRuntimeException: Not a record: "int"
**To Reproduce**
Steps to reproduce the behavior:
1. Source table DDL
```sql
CREATE TABLE hudi_test.datagen_source_20230530 (
user_id BIGINT,
age INT,
sex STRING,
score DOUBLE,
amount DECIMAL(10, 2),
numbers ARRAY<INT>,
person ROW<id INT, name
STRING>,
grade MAP<STRING, INT>,
my_date DATE,
my_float FLOAT
) WITH (
'connector' = 'datagen',
'rows-per-second' = '1',
'fields.user_id.kind' = 'random',
'fields.user_id.min' = '1',
'fields.user_id.max' = '1000',
'fields.age.kind' = 'random',
'fields.age.min' = '18',
'fields.age.max' = '60',
'fields.sex.kind' = 'random',
'fields.score.kind' = 'random',
'fields.score.min' = '0.0',
'fields.score.max' = '100.0',
'fields.amount.kind' = 'random',
'fields.amount.min' = '0.0',
'fields.amount.max' = '1000.0',
'fields.numbers.kind' = 'random',
'fields.numbers.element.min' = '1',
'fields.numbers.element.max' = '100',
'fields.person.kind' = 'random',
'fields.person.id.min' = '1',
'fields.person.id.max' = '1000',
'fields.person.name.length' = '5',
'fields.grade.kind' = 'random',
'fields.grade.key.length' = '5',
'fields.grade.value.min' = '1',
'fields.grade.value.max' = '100',
'fields.my_date.kind' = 'random',
'fields.my_float.kind' = 'random',
'fields.my_float.min' = '0.0',
'fields.my_float.max' = '100.0'
);
```
2. Hudi table DDL
```sql
create table hudi_test.t1_20230530_type_mor_sink(
user_id BIGINT,
age INT,
sex STRING,
score DOUBLE,
amount DECIMAL(10, 2),
numbers ARRAY<INT>,
person ROW<id INT,
name STRING>,
grade MAP<STRING, INT>,
my_date DATE,
my_float FLOAT
)
with(
'connector'='hudi',
'path' = 'hdfs://user/hive/warehouse/hudi_test/t1_20230530_type_mor_sink',
'table.type'='MERGE_ON_READ',
'hoodie.datasource.write.recordkey.field' = 'user_id',
'hoodie.datasource.write.precombine.field' = 'age',
'write.bucket_assign.tasks'='1',
'write.tasks' = '1',
'compaction.tasks' = '1',
'compaction.async.enabled' = 'true',
'compaction.schedule.enabled' = 'true',
'compaction.trigger.strategy' = 'num_commits',
'compaction.delta_commits' = '2',
'read.streaming.enabled' = 'true',
'changelog.enabled' = 'true',
'read.streaming.skip_compaction' = 'true',
'hive_sync.enable'='true',
'hive_sync.mode' = 'hms',
'hive_sync.metastore.uris' = 'thrift://0.0.0.0:0000',
'hive_sync.db'='hudi_hive_test',
'hive_sync.table'='t1_20230530_type_mor_sink',
'hadoop.dfs.namenode.acls.enabled' = 'false'
);
```

3. `select * from hudi_hive_test.t1_20230530_type_mor_sink_rt limit 10;` raises this error.

4. Selecting all the other fields (excluding `numbers`) raises no error:
```sql
select
user_id,
age,
sex,
score,
amount,
person,
grade,
my_date,
my_float
from hudi_hive_test.t1_20230530_type_mor_sink_rt limit 10;
```
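Since the query above only omits the array column, selecting that column alone should confirm it is the trigger (a narrowing step, assuming `numbers` is the culprit):

```sql
-- Expected to reproduce the AvroRuntimeException on the realtime table
select numbers from hudi_hive_test.t1_20230530_type_mor_sink_rt limit 10;
```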

Overall process:

**Expected behavior**
I think this is a bug, but I am not sure.
**Environment Description**
* Hudi version : 0.13
* Spark version :
* Hive version :3.1.2
* Hadoop version : 3.3.1
* Storage (HDFS/S3/GCS..) : HDFS
* Running on Docker? (yes/no) : no
**Additional context**
Whenever I use `ARRAY<INT>` or `ARRAY<LONG>`, I always get this error.
**Stacktrace**
```
org.apache.hive.service.cli.HiveSQLException: java.io.IOException:
org.apache.hudi.org.apache.avro.AvroRuntimeException: Not a record: "int"
```
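For context, `hms` sync registers two Hive tables for a MOR table: `<table>_ro` (read-optimized, base files only) and `<table>_rt` (realtime, merges log files at read time). Checking whether the `_ro` table hits the same error would show whether the problem lies in the realtime merge path or in the schema mapping itself. A sketch, assuming both synced tables exist:

```sql
-- Base-file-only read; no log-file merging involved.
-- If this succeeds while the _rt query fails, the realtime reader is suspect.
select numbers from hudi_hive_test.t1_20230530_type_mor_sink_ro limit 10;
```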
If you know anything about this, could you please help me? Thanks.