zjureel opened a new pull request, #480:
URL: https://github.com/apache/flink-table-store/pull/480

   This PR aims to support column type evolution for file data in Table Store. 
For example, suppose the fields of the underlying file data are [(1, a, int), 
(2, b, int), (3, c, int), (4, d, int)] and the stored row is (1, 2, 3, 4). 
After schema evolution the table fields are [(2, bb, bigint), (4, a, float), 
(5, a, int), (6, c, int)]. When Table Store reads data from the file according 
to the table fields, it should read the integer values [b->2, d->4] from the 
file and cast them to bigint and float respectively.
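
   The id-based mapping described above can be sketched as follows. This is an 
illustrative example only, not the actual Table Store code: the `Field` record 
and `computeIndexMapping` helper are hypothetical stand-ins for the real schema 
classes.

```java
import java.util.Arrays;
import java.util.List;

public class IndexMappingSketch {
    // Hypothetical field descriptor (id, name, type); illustrative only.
    record Field(int id, String name, String type) {}

    // For each table field, find the position of the file field with the same
    // id; -1 means the column is missing from the file and reads as null.
    static int[] computeIndexMapping(List<Field> tableFields, List<Field> fileFields) {
        int[] mapping = new int[tableFields.size()];
        for (int i = 0; i < tableFields.size(); i++) {
            mapping[i] = -1;
            for (int j = 0; j < fileFields.size(); j++) {
                if (fileFields.get(j).id() == tableFields.get(i).id()) {
                    mapping[i] = j;
                }
            }
        }
        return mapping;
    }

    public static void main(String[] args) {
        // The example from the description: file fields a..d, all int.
        List<Field> fileFields = List.of(
                new Field(1, "a", "int"), new Field(2, "b", "int"),
                new Field(3, "c", "int"), new Field(4, "d", "int"));
        // Table fields after schema evolution.
        List<Field> tableFields = List.of(
                new Field(2, "bb", "bigint"), new Field(4, "a", "float"),
                new Field(5, "a", "int"), new Field(6, "c", "int"));

        int[] mapping = computeIndexMapping(tableFields, fileFields);
        System.out.println(Arrays.toString(mapping)); // [1, 3, -1, -1]

        // Read the mapped values and cast: int -> bigint, int -> float.
        int[] fileRow = {1, 2, 3, 4};
        long bb = fileRow[mapping[0]];
        float a = fileRow[mapping[1]];
        System.out.println(bb + " " + a); // 2 4.0
    }
}
```

Matching by field id rather than by name or position is what lets renamed 
columns (b -> bb) still resolve to the right value in old files.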
   
   The main changes in this PR are as follows:
   1. Added `FieldGetterCastExecutor`, which reads a value from the underlying 
data via a `FieldGetter` and casts it with a `CastExecutor`
   2. Added `CastedRowData`, an implementation of `RowData` backed by 
`FieldGetterCastExecutor`
   3. Added `IndexCastMapping` to manage the index mapping and cast mapping, 
which are created together
   4. Updated `AbstractFileRecordIterator` to create a `CastedRowData` when 
there is column type evolution
   5. Updated `SchemaEvolutionUtil` to create the index and cast mappings
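
   The getter-plus-cast combination from change 1 can be sketched like this. 
The `FieldGetter` and `CastExecutor` interfaces below are simplified stand-ins 
for the real Table Store types, and the row is modeled as a plain `Object[]` 
rather than `RowData`:

```java
import java.util.function.Function;

public class CastedRowSketch {
    // Simplified stand-ins for Table Store's FieldGetter and CastExecutor.
    interface FieldGetter { Object getField(Object[] row); }
    interface CastExecutor extends Function<Object, Object> {}

    // Pairs a getter with a cast, in the spirit of FieldGetterCastExecutor:
    // read the raw value from the underlying row, then cast it to the
    // evolved column type (null passes through unchanged).
    record FieldGetterCastExecutor(FieldGetter getter, CastExecutor cast) {
        Object getFieldOrNull(Object[] row) {
            Object v = getter.getField(row);
            return v == null ? null : cast.apply(v);
        }
    }

    public static void main(String[] args) {
        Object[] fileRow = {1, 2, 3, 4}; // underlying int data

        // Field 2 (b -> bb): read index 1 and widen int to long (bigint).
        FieldGetterCastExecutor bb = new FieldGetterCastExecutor(
                r -> r[1], v -> ((Integer) v).longValue());
        // Field 4 (d): read index 3 and cast int to float.
        FieldGetterCastExecutor d = new FieldGetterCastExecutor(
                r -> r[3], v -> ((Integer) v).floatValue());

        System.out.println(bb.getFieldOrNull(fileRow)); // 2
        System.out.println(d.getFieldOrNull(fileRow));  // 4.0
    }
}
```

A `CastedRowData`-style wrapper would hold one such executor per column and 
answer its typed getters (`getLong`, `getFloat`, ...) through them, so readers 
see the evolved types without rewriting old files.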
   
   The main tests are as follows:
   1. Added `AppendOnlyTableColumnTypeFileDataTest`, 
`ChangelogValueCountColumnTypeFileDataTest` and 
`ChangelogWithKeyColumnTypeFileDataTest` to verify reading data from the table 
before and after column type evolution
   2. Updated `SparkSchemaEvolutionITCase` to read and verify data before and 
after complex column type evolution
   

