openinx commented on issue #1885:
URL: https://github.com/apache/iceberg/issues/1885#issuecomment-739941275


   @pan3793 , the Iceberg Flink sink will always write `RowData` into 
Avro/Parquet/ORC files underneath, no matter which data type you are writing. 
That is why we provide the `MapFunction<T, RowData> mapper` to convert your 
own data type into `RowData`; so you might need to provide a function that 
converts `GenericRecord` to `RowData`. The `TypeInformation<RowData>` 
specifies the data type of your input stream.
   
   We have provided a unit test that writes a `Row` stream to the Iceberg 
sink; you may want to take a look: 
https://github.com/apache/iceberg/blob/master/flink/src/test/java/org/apache/iceberg/flink/sink/TestFlinkIcebergSink.java#L153
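
   To make the wiring concrete, here is a minimal sketch (not taken from the 
linked test) of how a `DataStream<GenericRecord>` could be fed into the sink 
via `FlinkSink.builderFor`. The table path, the assumed field layout 
(`id`: long, `name`: string) and the `createAvroSource` helper are 
illustrative assumptions, not part of the Iceberg API.

   ```java
   import org.apache.avro.generic.GenericRecord;
   import org.apache.flink.api.common.functions.MapFunction;
   import org.apache.flink.api.common.typeinfo.TypeInformation;
   import org.apache.flink.streaming.api.datastream.DataStream;
   import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
   import org.apache.flink.table.data.GenericRowData;
   import org.apache.flink.table.data.RowData;
   import org.apache.flink.table.data.StringData;
   import org.apache.iceberg.flink.TableLoader;
   import org.apache.iceberg.flink.sink.FlinkSink;

   public class GenericRecordSinkExample {

     public static void main(String[] args) throws Exception {
       StreamExecutionEnvironment env =
           StreamExecutionEnvironment.getExecutionEnvironment();

       // Hypothetical source of Avro records -- replace with your own.
       DataStream<GenericRecord> records = createAvroSource(env);

       // Mapper converting each GenericRecord into RowData. Field positions
       // must line up with the Iceberg table schema (assumed: id, name).
       MapFunction<GenericRecord, RowData> mapper = record -> {
         GenericRowData row = new GenericRowData(2);
         row.setField(0, (Long) record.get("id"));
         row.setField(1, StringData.fromString(record.get("name").toString()));
         return row;
       };

       FlinkSink.builderFor(records, mapper, TypeInformation.of(RowData.class))
           // Assumed warehouse location for illustration only.
           .tableLoader(TableLoader.fromHadoopTable("hdfs://nn:8020/warehouse/t"))
           .append();

       env.execute("Write GenericRecord stream to Iceberg");
     }

     // Placeholder for whatever source produces your Avro records.
     private static DataStream<GenericRecord> createAvroSource(
         StreamExecutionEnvironment env) {
       throw new UnsupportedOperationException("plug in your Avro source here");
     }
   }
   ```

   The key point is that the sink itself only ever sees `RowData`; the mapper 
and the `TypeInformation<RowData>` are what bridge your input type to it.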


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
