Funny, I was just working on this recently. You can plug this mapper
into the FlinkSink builder:
https://gist.github.com/stevenzwu/4b824556973b47178824852083ab7a50

// Convert the Iceberg table schema to a Flink RowType
RowType rowType = FlinkSchemaUtil.convert(icebergSchema);

FlinkSink.builderFor(
        dataStream, // DataStream<GenericRecord>
        AvroGenericRecordToRowDataMapper.forAvroSchema(avroSchema),
        FlinkCompatibilityUtil.toTypeInfo(rowType))
    .table(table)
    .tableLoader(tableLoader)
    .writeParallelism(parallelism)
    .append();
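
In case it helps, the Iceberg schema above can be derived from the Avro
schema with AvroSchemaUtil. A sketch only — it assumes the table schema
matches the Avro record schema, and avroSchemaJson is a placeholder for
your actual schema definition string:

import org.apache.avro.Schema;
import org.apache.iceberg.avro.AvroSchemaUtil;

// Parse the Avro schema (avroSchemaJson is a placeholder), then convert
// it to an Iceberg schema for use with FlinkSchemaUtil.convert(...) above.
Schema avroSchema = new Schema.Parser().parse(avroSchemaJson);
org.apache.iceberg.Schema icebergSchema = AvroSchemaUtil.toIceberg(avroSchema);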


On Thu, Mar 31, 2022 at 11:32 AM Hailu, Andreas <andreas.ha...@gs.com>
wrote:

> Hello,
>
>
>
> I’m looking to write a proof of concept incorporating Iceberg into our ETL
> application. We use Avro as our schema management solution, and the Iceberg
> Flink documentation [1] only shows examples writing DataStream<Row> and
> DataStream<RowData> types.
>
>
>
> How can this be applied to DataStream<GenericRecord>? Is there a
> translation step/utility? I found a discussion on an issue from 2020 [2]
> regarding the same thing, but it’s unclear if a solution was reached.
>
>
>
> [1] https://iceberg.apache.org/docs/latest/flink/
>
> [2] https://github.com/apache/iceberg/issues/1885
>
>
>
> best,
>
> ah
>
