[
https://issues.apache.org/jira/browse/FLINK-26311?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Alexander Fedulov updated FLINK-26311:
--------------------------------------
Description:
https://issues.apache.org/jira/browse/FLINK-26311 adds a new implementation of the
CSV format based on StreamFormat.
The following needs to be tested:
# Reading CSV with the DataStream API using FileSource, with the schema derived
from a POJO [1]
# Reading CSV with the DataStream API using FileSource, with a custom Jackson
CsvSchema (e.g., a non-default delimiter) [1]
# Reading and writing CSV with SQL and the 'filesystem' connector, including the
option to skip malformed rows [2]
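Items 1 and 2 above could be exercised with a sketch along these lines (the POJO, its fields, and the file path are hypothetical placeholders; the CsvReaderFormat factory methods are the ones introduced by FLINK-24703):

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.connector.file.src.FileSource;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.csv.CsvReaderFormat;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;

public class CsvFormatReadSketch {

    // Hypothetical POJO, for illustration only; Jackson derives the CSV schema
    // from its public fields.
    public static class Transaction {
        public String id;
        public double amount;
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Item 1: schema derived automatically from the POJO class.
        CsvReaderFormat<Transaction> pojoFormat = CsvReaderFormat.forPojo(Transaction.class);

        // Item 2: explicit Jackson CsvSchema with a non-default (';') delimiter.
        CsvMapper mapper = new CsvMapper();
        CsvSchema schema = mapper.schemaFor(Transaction.class).withColumnSeparator(';');
        CsvReaderFormat<Transaction> customFormat =
                CsvReaderFormat.forSchema(mapper, schema, TypeInformation.of(Transaction.class));

        // Either format can be plugged into the FileSource; the path is a placeholder.
        FileSource<Transaction> source = FileSource
                .forRecordStreamFormat(pojoFormat, new Path("/tmp/input.csv"))
                .build();

        DataStream<Transaction> stream =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "csv-file-source");
        stream.print();
        env.execute("csv-format-read-test");
    }
}
```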
[1]
[https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/formats/csv/]
[2]
[https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/formats/csv/]
Partial docs examples (the full docs PR is still pending):
[https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/filesystem/]
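Item 3 could be verified with DDL roughly like the following (the table name, columns, and path are placeholders; 'csv.ignore-parse-errors' is the option that skips malformed rows):

```sql
-- Table and path names are placeholders.
CREATE TABLE transactions (
  id STRING,
  amount DOUBLE
) WITH (
  'connector' = 'filesystem',
  'path' = 'file:///tmp/transactions',
  'format' = 'csv',
  'csv.field-delimiter' = ';',
  'csv.ignore-parse-errors' = 'true'
);

-- Write, then read back through the same connector; rows that fail to parse
-- should be skipped rather than failing the job.
INSERT INTO transactions VALUES ('t-1', 9.99);
SELECT * FROM transactions;
```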
was:
https://issues.apache.org/jira/browse/FLINK-24703 adds a new implementation of the
CSV format based on StreamFormat and BulkWriter and adds support for .
The following needs to be tested:
# Reading CSV with the DataStream API using FileSource, with the schema derived
from a POJO [1]
# Reading CSV with the DataStream API using FileSource, with a custom Jackson
CsvSchema (e.g., a non-default delimiter) [1]
# Reading and writing CSV with SQL and the 'filesystem' connector, including the
option to skip malformed rows [2]
[1] [https://github.com/apache/flink/pull/18903]
[2]
[https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/formats/csv/]
Partial docs examples (the full docs PR is still pending):
[https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/filesystem/]
> Test CsvFormat
> --------------
>
> Key: FLINK-26311
> URL: https://issues.apache.org/jira/browse/FLINK-26311
> Project: Flink
> Issue Type: Improvement
> Reporter: Alexander Fedulov
> Priority: Blocker
> Labels: release-testing
> Fix For: 1.15.0
>
>
> https://issues.apache.org/jira/browse/FLINK-24703 adds a new implementation of the
> CSV format based on StreamFormat.
> The following needs to be tested:
> # Reading CSV with the DataStream API using FileSource, with the schema derived
> from a POJO [1]
> # Reading CSV with the DataStream API using FileSource, with a custom Jackson
> CsvSchema (e.g., a non-default delimiter) [1]
> # Reading and writing CSV with SQL and the 'filesystem' connector, including the
> option to skip malformed rows [2]
>
> [1]
> [https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/formats/csv/]
> [2]
> [https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/formats/csv/]
>
> Partial docs examples (the full docs PR is still pending):
> [https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/filesystem/]
>
--
This message was sent by Atlassian Jira
(v8.20.1#820001)