linliu-code opened a new issue, #14292:
URL: https://github.com/apache/hudi/issues/14292

   ### Bug Description
   
   After `hoodie.schema.on.read.enable=true` is set, `DESCRIBE EXTENDED` on a Hudi table no longer returns the detailed catalog metadata: the Catalog, Database, Owner, Created Time, Location, Serde Library, InputFormat, and OutputFormat rows disappear, and the table Type is reported as MANAGED instead of EXTERNAL. Reproduction transcript (PySpark shell, Spark 3.5.2):
   
   >>> spark.sql('''CREATE TABLE IF NOT EXISTS trips_quickstart1 (
   ...     ts BIGINT,
   ...     uuid STRING,
   ...     rider STRING,
   ...     driver STRING,
   ...     fare DOUBLE,
   ...     city STRING
   ... ) USING HUDI
   ... PARTITIONED BY (city)
   ... LOCATION 's3a://<table path>'
   ... TBLPROPERTIES (
   ...     primaryKey = 'uuid',
   ...     preCombineField = 'ts'
   ... )''').show(100)
   
   >>> spark.sql("describe extended trips_quickstart1").show(100)
   +--------------------+--------------------+-------+
   |            col_name|           data_type|comment|
   +--------------------+--------------------+-------+
   | _hoodie_commit_time|              string|   NULL|
   |_hoodie_commit_seqno|              string|   NULL|
   |  _hoodie_record_key|              string|   NULL|
   |_hoodie_partition...|              string|   NULL|
   |   _hoodie_file_name|              string|   NULL|
   |                  ts|              bigint|   NULL|
   |                uuid|              string|   NULL|
   |               rider|              string|   NULL|
   |              driver|              string|   NULL|
   |                fare|              double|   NULL|
   |                city|              string|   NULL|
   |# Partition Infor...|                    |       |
   |          # col_name|           data_type|comment|
   |                city|              string|   NULL|
   |                    |                    |       |
   |# Detailed Table ...|                    |       |
   |             Catalog|       spark_catalog|       |
   |            Database|                test|       |
   |               Table|   trips_quickstart1|       |
   |               Owner|              hadoop|       |
   |        Created Time|Sun Nov 16 15:22:...|       |
   |         Last Access|             UNKNOWN|       |
   |          Created By|  Spark 3.5.2-amzn-1|       |
   |                Type|            EXTERNAL|       |
   |            Provider|                hudi|       |
   |    Table Properties|[preCombineField=...|       |
   |            Location|s3a://<table path>...|       |
   |       Serde Library|org.apache.hadoop...|       |
   |         InputFormat|org.apache.hudi.h...|       |
   |        OutputFormat|org.apache.hadoop...|       |
   +--------------------+--------------------+-------+
   
   >>> spark.sql("set hoodie.schema.on.read.enable=true").show(100)
   +--------------------+-----+
   |                 key|value|
   +--------------------+-----+
   |hoodie.schema.on....| true|
   +--------------------+-----+
   
   >>> spark.sql("describe extended trips_quickstart1").show(100)
   +--------------------+--------------------+-------+
   |            col_name|           data_type|comment|
   +--------------------+--------------------+-------+
   | _hoodie_commit_time|              string|   NULL|
   |_hoodie_commit_seqno|              string|   NULL|
   |  _hoodie_record_key|              string|   NULL|
   |_hoodie_partition...|              string|   NULL|
   |   _hoodie_file_name|              string|   NULL|
   |                  ts|              bigint|   NULL|
   |                uuid|              string|   NULL|
   |               rider|              string|   NULL|
   |              driver|              string|   NULL|
   |                fare|              double|   NULL|
   |                city|              string|   NULL|
   |# Partition Infor...|                    |       |
   |          # col_name|           data_type|comment|
   |                city|              string|   NULL|
   |                    |                    |       |
   |# Detailed Table ...|                    |       |
   |                Name|spark_catalog.tes...|       |
   |                Type|             MANAGED|       |
   |            Provider|                hudi|       |
   |    Table Properties|[preCombineField=...|       |
   +--------------------+--------------------+-------+
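   For convenience, the transcript above can be collapsed into one reproduction script. This is a sketch, not a verified fix path: it assumes a Spark session with the Hudi 0.15.0 Spark bundle on the classpath, the `run(spark)` helper is illustrative, and the `<table path>` placeholder is kept exactly as in the report.
   
   ```python
   # Reproduction steps from the report, collected as SQL strings.
   # Running them requires a Hudi-enabled Spark session, e.g. a shell
   # started with the hudi-spark bundle; the S3 path placeholder below
   # must be replaced with a real table path.
   CREATE_TABLE = """
   CREATE TABLE IF NOT EXISTS trips_quickstart1 (
       ts BIGINT,
       uuid STRING,
       rider STRING,
       driver STRING,
       fare DOUBLE,
       city STRING
   ) USING HUDI
   PARTITIONED BY (city)
   LOCATION 's3a://<table path>'
   TBLPROPERTIES (
       primaryKey = 'uuid',
       preCombineField = 'ts'
   )
   """
   
   REPRO_STEPS = [
       CREATE_TABLE,
       "DESCRIBE EXTENDED trips_quickstart1",   # full detailed section shown
       "SET hoodie.schema.on.read.enable=true",
       "DESCRIBE EXTENDED trips_quickstart1",   # detailed section truncated
   ]
   
   def run(spark):
       """Execute the steps against a live SparkSession, as in the report."""
       for stmt in REPRO_STEPS:
           spark.sql(stmt).show(100)
   ```
   
   The second `DESCRIBE EXTENDED` is expected to print the same detailed table section as the first; in the transcript above it drops everything after the Table Properties row.
   
   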
   
   
   ### Environment
   
   **Hudi version:** 0.15.0
   **Query engine:** Spark
   **Relevant configs:** `hoodie.schema.on.read.enable`
   
   
   ### Logs and Stack Trace
   
   _No response_

