voonhous opened a new pull request, #7480:
URL: https://github.com/apache/hudi/pull/7480

   ### Change Logs
   
   Prior to Hudi's FULL Schema Evolution (HFSE) support, Hudi relied on Avro's 
schema resolution to perform schema evolution.
   
   The exhaustive list of permitted schema-changes that Avro's 
schema-resolution allows for can be found here:
   https://avro.apache.org/docs/1.10.2/spec.html#Schema+Resolution
   
   A summary of the permitted **type changes** is listed below:
   ```markdown
   Supported cast conversions:
    - Integer => Long, Float, Double, Decimal*, String*
    - Long => Float, Double, Decimal*, String*
    - Float => Double, Decimal*, String*
    - Double => Decimal*, String*
    - Decimal => Decimal*, String*
    - String => Bytes, Decimal*, Date*
    - Bytes => String
    - Date => String*
   
   *type conversions that are supported in Hudi, but not in Native Avro's 
schema-resolution
   ```
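   The permitted conversions above can be sketched as a simple lookup. This is a hypothetical illustration in Python, not Hudi's actual implementation; the map mirrors the list above, with the `*`-marked targets included:

   ```python
   # Hypothetical sketch of the permitted type conversions listed above.
   # Targets marked with '*' in the list are supported by Hudi's full schema
   # evolution (HFSE) but not by native Avro schema resolution.
   PERMITTED_CASTS = {
       "int":     {"long", "float", "double", "decimal", "string"},
       "long":    {"float", "double", "decimal", "string"},
       "float":   {"double", "decimal", "string"},
       "double":  {"decimal", "string"},
       "decimal": {"decimal", "string"},
       "string":  {"bytes", "decimal", "date"},
       "bytes":   {"string"},
       "date":    {"string"},
   }

   def is_permitted_cast(from_type: str, to_type: str) -> bool:
       """Return True if evolving a column from from_type to to_type is allowed."""
       return to_type in PERMITTED_CASTS.get(from_type, set())
   ```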
   
   The current write execution flow is as follows:
   1. deduceWriterSchema checks whether the incoming schema is compatible with 
the table's schema.
   2. deduceWriterSchema's validation is an adaptation of Avro's 
schema-compatibility check; if Avro permits the change, execution is allowed to 
proceed.
   3. Consequently, implicit schema changes that are compatible with Avro's 
schema resolution do not require HFSE to be enabled.
   4. If the writer writes to a different filegroup, that filegroup is written 
with the new schema, while existing filegroups that are not written to retain 
the old schema.
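   The decision in steps 1-3 above can be sketched as follows. This is a hypothetical Python simplification (the real deduceWriterSchema logic lives in Hudi's Java code, and `is_avro_compatible` stands in for the adapted Avro compatibility check):

   ```python
   # Hypothetical sketch of the write-path schema deduction described above:
   # an Avro-compatible change proceeds without HFSE; anything else
   # requires HFSE to be enabled.
   def deduce_writer_schema(table_schema, incoming_schema,
                            hfse_enabled, is_avro_compatible):
       if table_schema == incoming_schema:
           return table_schema
       if is_avro_compatible(table_schema, incoming_schema):
           # Implicit evolution: allowed even with HFSE disabled (step 3).
           return incoming_schema
       if hfse_enabled:
           return incoming_schema
       raise ValueError("Incompatible schema change; enable full schema evolution")
   ```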
   
   When reading:
   1. The same schema is used for all filegroups, since nothing is written to 
`.schema`.
   2. The Parquet reader throws errors due to type mismatches between that 
schema and the data files.
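   The read-side failure mode can be illustrated with a small sketch. This is a hypothetical simplification (real filegroups are Parquet files and the error is raised by Spark's Parquet reader, not by code like this):

   ```python
   # Hypothetical illustration of the read path described above: a single
   # reader schema is applied to every filegroup, so filegroups written
   # before the implicit change still carry the old physical type.
   def read_file_group(file_group, reader_schema):
       for column, expected_type in reader_schema.items():
           actual_type = file_group["schema"][column]
           if actual_type != expected_type:
               raise TypeError(
                   f"Parquet type mismatch for '{column}': "
                   f"file has {actual_type}, reader expects {expected_type}")
       return file_group["rows"]

   # Old filegroup was written with `price` as int; after an implicit
   # int -> long change, the single reader schema expects long.
   old_fg = {"schema": {"price": "int"},  "rows": [{"price": 1}]}
   new_fg = {"schema": {"price": "long"}, "rows": [{"price": 2}]}
   reader_schema = {"price": "long"}
   ```

   Reading `new_fg` succeeds, while reading `old_fg` fails with a type mismatch, which is the behavior this PR fixes on the Spark read path.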
   
   This PR fixes the Spark-read issues that arise when implicit schema changes 
are made without enabling HFSE.
   
   The scope of this fix is limited to Spark-Read + Spark-Write.
   
   TODO: Check if issue exists in Flink reader/writer. 
   
   ### Impact
   
   None; no public APIs changed.
   
   ### Risk level (write none, low medium or high below)
   
   low
   
   ### Documentation Update
   
   _Describe any necessary documentation update if there is any new feature, 
config, or user-facing change_
   
   - _The config description must be updated if new configs are added or the 
default value of the configs are changed_
   - _Any new feature or user-facing change requires updating the Hudi website. 
Please create a Jira ticket, attach the
     ticket number here and follow the 
[instruction](https://hudi.apache.org/contribute/developer-setup#website) to 
make
     changes to the website._
   
   ### Contributor's checklist
   
   - [ ] Read through [contributor's 
guide](https://hudi.apache.org/contribute/how-to-contribute)
   - [ ] Change Logs and Impact were stated clearly
   - [ ] Adequate tests were added if applicable
   - [ ] CI passed
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
