pan3793 opened a new issue #1885:
URL: https://github.com/apache/iceberg/issues/1885
We use Avro schemas as our unified ETL schema management solution. While trying to write data into Iceberg using Flink, I found that Flink has many terms for representing data types, such as `TypeInformation`, `LogicalType`, `RowType`, `TableSchema`, and `DataType`... I can't figure out how they relate to each other or how to convert between them.
Specifically, my question is: *How can I write a `DataStream<GenericRecord>`
to an Iceberg table using the Flink Iceberg API?* I believe the Avro `Schema` should
carry enough information to describe the record schema.
Should I use the APIs below? If so, how can I adapt them to a
`DataStream<GenericRecord>`?
```java
public static <T> Builder builderFor(DataStream<T> input,
                                     MapFunction<T, RowData> mapper,
                                     TypeInformation<RowData> outputType)
```
```java
public static Builder forRow(DataStream<Row> input, TableSchema tableSchema)
```
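For context, here is one possible way the `builderFor` overload above could be adapted to a `DataStream<GenericRecord>`. This is only a sketch, assuming Flink's `flink-avro` module (its `AvroSchemaConverter` and `AvroToRowDataConverters` classes, available around Flink 1.11) and a `TableLoader` pointing at an existing Iceberg table; the table path and stream source are hypothetical placeholders:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.formats.avro.AvroToRowDataConverters;
import org.apache.flink.formats.avro.typeutils.AvroSchemaConverter;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.runtime.typeutils.InternalTypeInfo;
import org.apache.flink.table.types.logical.RowType;
import org.apache.iceberg.flink.TableLoader;
import org.apache.iceberg.flink.sink.FlinkSink;

public class AvroRecordsToIceberg {

  // Sketch only: 'records' and the table location are placeholders.
  static void writeRecords(DataStream<GenericRecord> records, Schema avroSchema) {
    // Derive Flink's RowType from the Avro schema, so both the converter
    // and the output TypeInformation agree on the row layout.
    RowType rowType = (RowType) AvroSchemaConverter
        .convertToDataType(avroSchema.toString())
        .getLogicalType();

    // flink-avro ships a runtime converter from GenericRecord to RowData.
    AvroToRowDataConverters.AvroToRowDataConverter converter =
        AvroToRowDataConverters.createRowConverter(rowType);

    MapFunction<GenericRecord, RowData> mapper =
        record -> (RowData) converter.convert(record);

    FlinkSink.builderFor(records, mapper, InternalTypeInfo.of(rowType))
        .tableLoader(TableLoader.fromHadoopTable("hdfs://nn:8020/path/to/table")) // hypothetical
        .build();
  }
}
```

The idea is that the Avro `Schema` is the single source of truth: `RowType` is derived from it once, then reused for both the per-record conversion and the sink's `TypeInformation<RowData>`. Whether the lambda-based `MapFunction` serializes cleanly with the captured converter may depend on the Flink version, so this should be verified against the actual dependencies in use.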