Jack1007 opened a new issue, #5450:
URL: https://github.com/apache/paimon/issues/5450

   ### Search before asking
   
   - [x] I searched in the [issues](https://github.com/apache/paimon/issues) and found nothing similar.
   
   
   ### Paimon version
   
   paimon-1.0.1
   
   ### Compute Engine
   
   hive-3.1.2
   spark-3.5.1
   
   ### Minimal reproduce step
   
   Step 1:
   Configure the Spark Paimon catalog:
   ```
   spark.sql.extensions=org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions
   spark.sql.catalog.spark_catalog=org.apache.paimon.spark.SparkGenericCatalog
   spark.sql.catalog.spark_catalog.metastore=hive
   spark.sql.catalog.spark_catalog.uri=thrift://metastore:9083
   spark.sql.catalog.spark_catalog.warehouse=hdfs://nameservice1/user/hive/warehouse
   ```
   Create a table using Spark SQL:
   ``` sql
   create table paimondb.test_ts (name string, ts timestamp);
   ```
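   If a plain `timestamp` column is the goal, one possible workaround (my assumption, not something verified in this report) is to switch Spark's session default timestamp type to `TIMESTAMP_NTZ`, which Spark 3.4+ supports, so that Paimon should record `TIMESTAMP` instead of `TIMESTAMP WITH LOCAL TIME ZONE`. The table names below are hypothetical:
   ```sql
   -- assumption: Spark 3.4+ session; hypothetical table names
   SET spark.sql.timestampType=TIMESTAMP_NTZ;
   create table paimondb.test_ts_ntz (name string, ts timestamp);
   -- or declare the type explicitly:
   create table paimondb.test_ts_ntz2 (name string, ts timestamp_ntz);
   ```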
   Step 2:
   Show the table's schema in Hive:
   ```sql
   show create table paimondb.test_ts;
   ```
   
   Hive shows no column info, and the following exception appears in the log:
   ```
   java.lang.IllegalArgumentException: Hive DDL and paimon schema mismatched! It is recommended not to write any column definition as Paimon external table can read schema from the specified location.
   Mismatched fields are:
   Field #7
   Hive DDL          : ts timestamp
   Paimon Schema: ts timestamp with local time zone

           at org.apache.paimon.hive.HiveSchema.checkFieldsMatched(HiveSchema.java:274) ~[paimon-hive-connector-3.1-1.0.1.jar:1.0.1]
           at org.apache.paimon.hive.HiveSchema.extract(HiveSchema.java:165) ~[paimon-hive-connector-3.1-1.0.1.jar:1.0.1]
           at org.apache.paimon.hive.PaimonSerDe.initialize(PaimonSerDe.java:67) ~[paimon-hive-connector-3.1-1.0.1.jar:1.0.1]
           at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54) ~[hive-exec-3.1.2.jar:3.1.2]
           at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDeWithoutErrorCheck(SerDeUtils.java:562) ~[hive-exec-3.1.2.jar:3.1.2]
           at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:87) ~[hive-exec-3.1.2.jar:3.1.2]
           at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77) ~[hive-exec-3.1.2.jar:3.1.2]
           at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:289) ~[hive-exec-3.1.2.jar:3.1.2]
           at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:282) ~[hive-exec-3.1.2.jar:3.1.2]
   ```
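   The error message itself suggests a possible workaround. As a sketch based only on that suggestion (not verified here), the Hive table could be registered without any column definitions so the Paimon schema file supplies them; the location and table name below are hypothetical:
   ```sql
   -- Hypothetical Hive DDL following the error message's advice:
   -- omit column definitions and let Paimon read the schema from the location.
   CREATE EXTERNAL TABLE paimondb.test_ts_ext
   STORED BY 'org.apache.paimon.hive.PaimonStorageHandler'
   LOCATION 'hdfs://nameservice1/user/hive/warehouse/paimondb.db/test_ts';
   ```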
   
   ### What doesn't meet your expectations?
   
   When Spark creates a table with the `timestamp` type, that type by default means 'timestamp with local time zone', so the Paimon schema file records the type as 'TIMESTAMP(6) WITH LOCAL TIME ZONE'.
   Then, when Hive is used to show the table's schema, it throws the exception above. Hive has supported the 'timestamp with local time zone' type since version 3.1.2, so the error should not occur.
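   To illustrate the claim about Hive's support (my own assumption, stated for context): the type name is accepted directly in Hive 3.x DDL, which is why the strict equality check in `HiveSchema.checkFieldsMatched` seems unnecessarily restrictive for this particular pair of types:
   ```sql
   -- Hive 3.x DDL accepting the type directly (hypothetical table name):
   CREATE TABLE ts_ltz_demo (ts TIMESTAMP WITH LOCAL TIME ZONE);
   ```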
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [ ] I'm willing to submit a PR!

