bithw1 opened a new issue, #11967:
URL: https://github.com/apache/hudi/issues/11967

   I am trying Hudi 0.15.0 with Spark 3.3.0.
   
   I have put hive-site.xml under $SPARK_HOME/conf, and I start spark-sql with the following command:
   ```
   spark-sql \
   --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
   --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
   --conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
   --conf 'spark.kryo.registrator=org.apache.spark.HoodieSparkKryoRegistrar'
   ```
   
   Then I create a table with the following DDL:
   
   ```
   CREATE TABLE hudi_table (
       ts BIGINT,
       uuid STRING,
       rider STRING,
       driver STRING,
       fare DOUBLE,
       city STRING
   ) USING HUDI
   PARTITIONED BY (city)
    LOCATION '/tmp/hudi_table'
   ```
   
   The table is created successfully, but I have two questions.
   
   1. When I run `SHOW TABLES` in the Hive CLI, `hudi_table` shows up, so the table definition appears to have been synced to Hive. However, I did not enable Hive sync with configurations such as `hoodie.datasource.meta.sync.enable` or `hoodie.datasource.hive_sync.mode`. How could this happen?
   2. When I run `SHOW CREATE TABLE` from the Hive CLI, it shows an external table, which is correct because I specified a location in the DDL. But when I run `SHOW CREATE TABLE` from the spark-sql CLI, it shows a non-external (managed) table, which is incorrect. This looks like a bug to me.
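   For reference, these are the statements I used to compare the two sides (a minimal sketch, assuming the standard `SHOW CREATE TABLE` and `DESCRIBE FORMATTED` commands against the `hudi_table` created above):

   ```sql
   -- In the Hive CLI: the table metadata includes EXTERNAL,
   -- e.g. DESCRIBE FORMATTED reports Table Type: EXTERNAL_TABLE
   SHOW CREATE TABLE hudi_table;
   DESCRIBE FORMATTED hudi_table;

   -- In the spark-sql CLI: the generated DDL omits the EXTERNAL
   -- keyword, as if the table were managed
   SHOW CREATE TABLE hudi_table;
   DESCRIBE TABLE EXTENDED hudi_table;
   ```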
   
   
   

