[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

lincoln lee updated FLINK-26760:
--------------------------------
    Fix Version/s:     (was: 1.19.0)

> The new CSV source (file system source + CSV format) does not support reading
> files whose file encoding is not UTF-8
>
>                 Key: FLINK-26760
>                 URL: https://issues.apache.org/jira/browse/FLINK-26760
>             Project: Flink
>          Issue Type: Improvement
>          Components: Connectors / FileSystem, Formats (JSON, Avro, Parquet, ORC, SequenceFile)
>    Affects Versions: 1.13.6, 1.14.4, 1.15.0
>            Reporter: Lijie Wang
>            Priority: Critical
>              Labels: pull-request-available
>             Fix For: 1.20.0
>         Attachments: PerformanceTest.java, example.csv
>
> The new CSV source (file system source + CSV format) does not support reading
> files whose file encoding is not UTF-8, but the legacy {{CsvTableSource}} supports it.
> We provide an {{example.csv}} whose file encoding is {{ISO-8859-1}}.
> When reading it with the legacy {{CsvTableSource}}, it executes correctly:
> {code:java}
> @Test
> public void testLegacyCsvSource() {
>     EnvironmentSettings environmentSettings = EnvironmentSettings.inBatchMode();
>     TableEnvironment tEnv = TableEnvironment.create(environmentSettings);
>     CsvTableSource.Builder builder = CsvTableSource.builder();
>     CsvTableSource source =
>             builder.path("example.csv")
>                     .emptyColumnAsNull()
>                     .lineDelimiter("\n")
>                     .fieldDelimiter("|")
>                     .field("name", DataTypes.STRING())
>                     .build();
>     ConnectorCatalogTable catalogTable = ConnectorCatalogTable.source(source, true);
>     tEnv.getCatalog(tEnv.getCurrentCatalog())
>             .ifPresent(
>                     catalog -> {
>                         try {
>                             catalog.createTable(
>                                     new ObjectPath(tEnv.getCurrentDatabase(), "example"),
>                                     catalogTable,
>                                     false);
>                         } catch (Exception e) {
>                             throw new RuntimeException(e);
>                         }
>                     });
>     tEnv.executeSql("select count(name) from example").print();
> }
> {code}
> When reading it with the new CSV source (file system source + CSV format), it
> throws the following error:
> {code:java}
> @Test
> public void testNewCsvSource() {
>     EnvironmentSettings environmentSettings = EnvironmentSettings.inBatchMode();
>     TableEnvironment tEnv = TableEnvironment.create(environmentSettings);
>     String ddl =
>             "create table example ("
>                     + "name string"
>                     + ") with ("
>                     + "'connector' = 'filesystem',"
>                     + "'path' = 'example.csv',"
>                     + "'format' = 'csv',"
>                     + "'csv.array-element-delimiter' = '\n',"
>                     + "'csv.field-delimiter' = '|',"
>                     + "'csv.null-literal' = ''"
>                     + ")";
>     tEnv.executeSql(ddl);
>     tEnv.executeSql("select count(name) from example").print();
> }
> {code}
> {code:java}
> Caused by: java.lang.RuntimeException: One or more fetchers have encountered exception
>     at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager.checkErrors(SplitFetcherManager.java:225)
>     at org.apache.flink.connector.base.source.reader.SourceReaderBase.getNextFetch(SourceReaderBase.java:169)
>     at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:130)
>     at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:385)
>     at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:68)
>     at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65)
>     at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:519)
>     at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:203)
>     at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:804)
>     at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:753)
>     at
> {code}
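The failure boils down to charset decoding: a byte such as 0xE9 ('é' in ISO-8859-1) is not a valid UTF-8 sequence, so a reader that assumes UTF-8 either throws or emits replacement characters, which is why the legacy source (which decodes with a configurable charset) succeeds where the new source fails. A minimal JDK-only sketch of the mismatch — the class name and sample bytes are illustrative, not taken from Flink:

```java
import java.nio.charset.StandardCharsets;

public class CharsetMismatchDemo {
    public static void main(String[] args) {
        // "José" encoded in ISO-8859-1: 'é' is the single byte 0xE9.
        byte[] latin1Bytes = {0x4A, 0x6F, 0x73, (byte) 0xE9};

        // Decoding with the correct charset recovers the original text.
        String ok = new String(latin1Bytes, StandardCharsets.ISO_8859_1);

        // Decoding the same bytes as UTF-8 corrupts them: 0xE9 announces a
        // three-byte UTF-8 sequence, but no continuation bytes follow, so
        // the decoder substitutes U+FFFD (the Unicode replacement character).
        String broken = new String(latin1Bytes, StandardCharsets.UTF_8);

        System.out.println(ok);                        // José
        System.out.println(broken.contains("\uFFFD")); // true
    }
}
```

Until the source supports a charset option, one workaround under these assumptions is to transcode such files to UTF-8 before ingestion (e.g. reading with an `InputStreamReader` in ISO-8859-1 and rewriting in UTF-8).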
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

lincoln lee updated FLINK-26760:
--------------------------------
    Fix Version/s: 1.20.0
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated FLINK-26760:
-----------------------------------
    Labels: pull-request-available  (was: )
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Martijn Visser updated FLINK-26760:
-----------------------------------
    Priority: Critical  (was: Major)
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jing Ge updated FLINK-26760:
----------------------------
    Fix Version/s: 1.19.0  (was: 1.18.0)
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xintong Song updated FLINK-26760:
---------------------------------
    Fix Version/s: 1.18.0  (was: 1.17.0)
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xingbo Huang updated FLINK-26760:
---------------------------------
    Fix Version/s: 1.17.0  (was: 1.16.0)
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yun Gao updated FLINK-26760:
----------------------------
    Fix Version/s: 1.16.0
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Martijn Visser updated FLINK-26760: --- Priority: Major (was: Critical)
> Key: FLINK-26760
> URL: https://issues.apache.org/jira/browse/FLINK-26760
> Project: Flink
> Issue Type: Improvement
> Components: Connectors / FileSystem, Formats (JSON, Avro, Parquet, ORC, SequenceFile)
> Affects Versions: 1.15.0, 1.13.6, 1.14.4
> Reporter: Lijie Wang
> Priority: Major
> Fix For: 1.15.0
> Attachments: PerformanceTest.java, example.csv
>
> The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8, but the legacy {{CsvTableSource}} supports it.
> We provide an {{*example.csv*}} whose file encoding is {{ISO-8859-1}}. When reading it with the legacy {{CsvTableSource}}, it executes correctly:
> {code:java}
> @Test
> public void testLegacyCsvSource() {
>     EnvironmentSettings environmentSettings = EnvironmentSettings.inBatchMode();
>     TableEnvironment tEnv = TableEnvironment.create(environmentSettings);
>     CsvTableSource.Builder builder = CsvTableSource.builder();
>     CsvTableSource source =
>             builder.path("example.csv")
>                     .emptyColumnAsNull()
>                     .lineDelimiter("\n")
>                     .fieldDelimiter("|")
>                     .field("name", DataTypes.STRING())
>                     .build();
>     ConnectorCatalogTable catalogTable = ConnectorCatalogTable.source(source, true);
>     tEnv.getCatalog(tEnv.getCurrentCatalog())
>             .ifPresent(
>                     catalog -> {
>                         try {
>                             catalog.createTable(
>                                     new ObjectPath(tEnv.getCurrentDatabase(), "example"),
>                                     catalogTable,
>                                     false);
>                         } catch (Exception e) {
>                             throw new RuntimeException(e);
>                         }
>                     });
>     tEnv.executeSql("select count(name) from example").print();
> }
> {code}
>
> When reading it with the new CSV source (file system source + CSV format), it throws the following error:
> {code:java}
> @Test
> public void testNewCsvSource() {
>     EnvironmentSettings environmentSettings = EnvironmentSettings.inBatchMode();
>     TableEnvironment tEnv = TableEnvironment.create(environmentSettings);
>     String ddl =
>             "create table example ("
>                     + "name string"
>                     + ") with ("
>                     + "'connector' = 'filesystem',"
>                     + "'path' = 'example.csv',"
>                     + "'format' = 'csv',"
>                     + "'csv.array-element-delimiter' = '\n',"
>                     + "'csv.field-delimiter' = '|',"
>                     + "'csv.null-literal' = ''"
>                     + ")";
>     tEnv.executeSql(ddl);
>     tEnv.executeSql("select count(name) from example").print();
> }
> {code}
> {code:java}
> Caused by: java.lang.RuntimeException: One or more fetchers have encountered exception
>     at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcherManager.checkErrors(SplitFetcherManager.java:225)
>     at org.apache.flink.connector.base.source.reader.SourceReaderBase.getNextFetch(SourceReaderBase.java:169)
>     at org.apache.flink.connector.base.source.reader.SourceReaderBase.pollNext(SourceReaderBase.java:130)
>     at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:385)
>     at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:68)
>     at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65)
>     at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:519)
>     at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:203)
>     at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:804)
>     at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:753)
>     at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:948)
>     at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:927)
>     at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:741)
>     at org.apache.flink.runtime.taskmanager.Task.run(Task.java:563)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.RuntimeException: SplitFetcher thread 0 received unexpected exception while polling the records
>     at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:150)
>     at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:105)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at
> {code}
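For context (this snippet is not part of the original report, and does not use any Flink API): the failure mode can be reproduced with plain JDK charset handling. A byte such as 0xE9 ('é' in ISO-8859-1) is not a valid UTF-8 sequence, so a reader that hard-codes UTF-8 rejects it, while decoding with the file's actual charset succeeds:

```java
import java.nio.ByteBuffer;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.CharsetDecoder;
import java.nio.charset.CodingErrorAction;
import java.nio.charset.StandardCharsets;

public class EncodingDemo {
    public static void main(String[] args) {
        // "café" encoded as ISO-8859-1: 'é' becomes the single byte 0xE9.
        byte[] latin1Bytes = "café".getBytes(StandardCharsets.ISO_8859_1);

        // A strict UTF-8 decoder (analogous to a UTF-8-only CSV reader)
        // rejects the lone 0xE9, which cannot start a valid UTF-8 sequence.
        CharsetDecoder utf8 = StandardCharsets.UTF_8.newDecoder()
                .onMalformedInput(CodingErrorAction.REPORT);
        try {
            utf8.decode(ByteBuffer.wrap(latin1Bytes));
            System.out.println("decoded as UTF-8 (not expected for this input)");
        } catch (CharacterCodingException e) {
            System.out.println("UTF-8 decode failed: " + e);
        }

        // Decoding with the file's actual charset round-trips correctly.
        String decoded = new String(latin1Bytes, StandardCharsets.ISO_8859_1);
        System.out.println("ISO-8859-1 decode: " + decoded);
    }
}
```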
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Lijie Wang updated FLINK-26760: --- Attachment: PerformanceTest.java
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Arvid Heise updated FLINK-26760: Issue Type: Improvement (was: Bug)
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Arvid Heise updated FLINK-26760: Priority: Critical (was: Blocker)
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Fabian Paul updated FLINK-26760: Priority: Blocker (was: Major)
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Fabian Paul updated FLINK-26760: Fix Version/s: 1.15.0
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Zhu Zhu updated FLINK-26760: Component/s: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Lijie Wang updated FLINK-26760: --- Affects Version/s: 1.14.4, 1.13.6, 1.15.0
[jira] [Updated] (FLINK-26760) The new CSV source (file system source + CSV format) does not support reading files whose file encoding is not UTF-8
[ https://issues.apache.org/jira/browse/FLINK-26760?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Lijie Wang updated FLINK-26760: --- Description: (issue description updated)
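Not part of the original report: until the new CSV source supports a configurable charset, one possible workaround is to transcode such a file to UTF-8 before handing it to the filesystem connector. A minimal sketch using only the JDK (the file paths are illustrative; for very large files a streaming copy would be preferable to reading the whole file into memory):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class TranscodeToUtf8 {
    public static void main(String[] args) throws IOException {
        Path in = Path.of("example.csv");        // source file in ISO-8859-1
        Path out = Path.of("example.utf8.csv");  // UTF-8 copy for the new source

        // Decode with the file's actual charset, re-encode as UTF-8.
        String content = Files.readString(in, StandardCharsets.ISO_8859_1);
        Files.writeString(out, content, StandardCharsets.UTF_8);
    }
}
```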