[jira] [Created] (FLINK-35566) Consider promoting TypeSerializer from PublicEvolving to Public
Martijn Visser created FLINK-35566: -- Summary: Consider promoting TypeSerializer from PublicEvolving to Public Key: FLINK-35566 URL: https://issues.apache.org/jira/browse/FLINK-35566 Project: Flink Issue Type: Technical Debt Components: API / Core Reporter: Martijn Visser While working on implementing FLINK-35378, I ran into the problem that TypeSerializer has been PublicEvolving since Flink 1.0. We should consider annotating it as Public. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-35378) [FLIP-453] Promote Unified Sink API V2 to Public and Deprecate SinkFunc
Martijn Visser created FLINK-35378: -- Summary: [FLIP-453] Promote Unified Sink API V2 to Public and Deprecate SinkFunc Key: FLINK-35378 URL: https://issues.apache.org/jira/browse/FLINK-35378 Project: Flink Issue Type: Technical Debt Components: API / Core Reporter: Martijn Visser Assignee: Martijn Visser https://cwiki.apache.org/confluence/pages/resumedraft.action?draftId=303794871=af4ace88-98b7-4a53-aece-cd67d2f91a15; -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-35350) Add documentation for Kudu
Martijn Visser created FLINK-35350: -- Summary: Add documentation for Kudu Key: FLINK-35350 URL: https://issues.apache.org/jira/browse/FLINK-35350 Project: Flink Issue Type: Sub-task Components: Connectors / Kudu Reporter: Martijn Visser Fix For: kudu-2.0.0 There's currently no documentation for Kudu; this should be added. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-35333) JdbcXaSinkTestBase fails in weekly Flink JDBC Connector tests
Martijn Visser created FLINK-35333: -- Summary: JdbcXaSinkTestBase fails in weekly Flink JDBC Connector tests Key: FLINK-35333 URL: https://issues.apache.org/jira/browse/FLINK-35333 Project: Flink Issue Type: Bug Components: Connectors / JDBC Affects Versions: jdbc-3.2.0 Reporter: Martijn Visser https://github.com/apache/flink-connector-jdbc/actions/runs/9047366679/job/24859224407#step:15:147 {code:java} Error: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile (default-testCompile) on project flink-connector-jdbc: Compilation failure Error: /home/runner/work/flink-connector-jdbc/flink-connector-jdbc/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/xa/JdbcXaSinkTestBase.java:[164,37] is not abstract and does not override abstract method getTaskInfo() in org.apache.flink.api.common.functions.RuntimeContext {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-35280) Migrate HBase Sink connector to use the ASync Sink API
Martijn Visser created FLINK-35280: -- Summary: Migrate HBase Sink connector to use the ASync Sink API Key: FLINK-35280 URL: https://issues.apache.org/jira/browse/FLINK-35280 Project: Flink Issue Type: Technical Debt Components: Connectors / HBase Affects Versions: hbase-3.0.0, hbase-3.0.1, hbase-4.0.0 Reporter: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-35109) Drop support for Flink 1.17 and 1.18 in Flink Kafka connector
Martijn Visser created FLINK-35109: -- Summary: Drop support for Flink 1.17 and 1.18 in Flink Kafka connector Key: FLINK-35109 URL: https://issues.apache.org/jira/browse/FLINK-35109 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser Fix For: kafka-4.0.0 The Flink Kafka connector currently can't compile against Flink 1.20-SNAPSHOT. An example failure can be found at https://github.com/apache/flink-connector-kafka/actions/runs/8659822490/job/23746484721#step:15:169 The {code:java}TypeSerializerUpgradeTestBase{code} has had issues before, see FLINK-32455. See also specifically the comment in https://issues.apache.org/jira/browse/FLINK-32455?focusedCommentId=17739785=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17739785 In addition, there's also FLINK-25509, which can only be supported with Flink 1.19 and higher. So we should:
* Drop support for 1.17 and 1.18
* Refactor the Flink Kafka connector to use the new {code:java}MigrationTest{code}
We will support the Flink Kafka connector for Flink 1.18 via the v3.1 branch; this change will be a new v4.0 version with support for Flink 1.19 and the upcoming Flink 1.20. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-35009) Change on getTransitivePredecessors breaks connectors
Martijn Visser created FLINK-35009: -- Summary: Change on getTransitivePredecessors breaks connectors Key: FLINK-35009 URL: https://issues.apache.org/jira/browse/FLINK-35009 Project: Flink Issue Type: Bug Components: API / Core, Connectors / Kafka Affects Versions: 1.18.2, 1.20.0, 1.19.1 Reporter: Martijn Visser {code:java} Error: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile (default-testCompile) on project flink-connector-kafka: Compilation failure: Compilation failure: Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java:[214,24] org.apache.flink.streaming.connectors.kafka.testutils.DataGenerators.InfiniteStringsGenerator.MockTransformation is not abstract and does not override abstract method getTransitivePredecessorsInternal() in org.apache.flink.api.dag.Transformation Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/DataGenerators.java:[220,44] getTransitivePredecessors() in org.apache.flink.streaming.connectors.kafka.testutils.DataGenerators.InfiniteStringsGenerator.MockTransformation cannot override getTransitivePredecessors() in org.apache.flink.api.dag.Transformation Error:overridden method is final {code} Example: https://github.com/apache/flink-connector-kafka/actions/runs/8494349338/job/23269406762#step:15:167 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-35008) Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink Kafka connector
Martijn Visser created FLINK-35008: -- Summary: Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink Kafka connector Key: FLINK-35008 URL: https://issues.apache.org/jira/browse/FLINK-35008 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-35007) Update Flink Kafka connector to support 1.19 and test 1.20-SNAPSHOT
Martijn Visser created FLINK-35007: -- Summary: Update Flink Kafka connector to support 1.19 and test 1.20-SNAPSHOT Key: FLINK-35007 URL: https://issues.apache.org/jira/browse/FLINK-35007 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34621) Bump com.google.guava:guava from 31.1-jre to 32.0.0-jre in /flink-connector-hbase-base
Martijn Visser created FLINK-34621: -- Summary: Bump com.google.guava:guava from 31.1-jre to 32.0.0-jre in /flink-connector-hbase-base Key: FLINK-34621 URL: https://issues.apache.org/jira/browse/FLINK-34621 Project: Flink Issue Type: Technical Debt Components: Connectors / HBase Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34515) Document new FLIP process
Martijn Visser created FLINK-34515: -- Summary: Document new FLIP process Key: FLINK-34515 URL: https://issues.apache.org/jira/browse/FLINK-34515 Project: Flink Issue Type: New Feature Components: Documentation Reporter: Martijn Visser Assignee: Martijn Visser Per https://lists.apache.org/thread/rkpvlnwj9gv1hvx1dyklx6k88qpnvk2t Contributors create a Google Doc, make it view-only, and post it to the mailing list for a discussion thread. When the discussions have been resolved, the contributor asks a committer/PMC member on the Dev mailing list to copy the contents from the Google Doc and create a FLIP number for them. The contributor can then use that FLIP to start a VOTE thread. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34461) MongoDB weekly builds fail with time out on Flink 1.18.1 for JDK17
Martijn Visser created FLINK-34461: -- Summary: MongoDB weekly builds fail with time out on Flink 1.18.1 for JDK17 Key: FLINK-34461 URL: https://issues.apache.org/jira/browse/FLINK-34461 Project: Flink Issue Type: Bug Components: Connectors / MongoDB Affects Versions: mongodb-1.1.0 Reporter: Martijn Visser The weekly tests for MongoDB consistently time out for the v1.0 branch while testing Flink 1.18.1 for JDK17: https://github.com/apache/flink-connector-mongodb/actions/runs/7770329490/job/21190387348 https://github.com/apache/flink-connector-mongodb/actions/runs/7858349600/job/21443232301 https://github.com/apache/flink-connector-mongodb/actions/runs/7945225005/job/21691624903 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34435) Bump org.yaml:snakeyaml from 1.31 to 2.2 for flink-connector-elasticsearch
Martijn Visser created FLINK-34435: -- Summary: Bump org.yaml:snakeyaml from 1.31 to 2.2 for flink-connector-elasticsearch Key: FLINK-34435 URL: https://issues.apache.org/jira/browse/FLINK-34435 Project: Flink Issue Type: Technical Debt Components: Connectors / ElasticSearch Reporter: Martijn Visser Assignee: Martijn Visser https://github.com/apache/flink-connector-elasticsearch/pull/90 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34432) Re-enable forkReuse for flink-table-planner
Martijn Visser created FLINK-34432: -- Summary: Re-enable forkReuse for flink-table-planner Key: FLINK-34432 URL: https://issues.apache.org/jira/browse/FLINK-34432 Project: Flink Issue Type: Technical Debt Components: Table SQL / Client, Test Infrastructure, Tests Affects Versions: 1.19.0, 1.18.2, 1.20.0 Reporter: Martijn Visser With FLINK-18356 resolved, we should re-enable forkReuse for flink-table-planner to speed up the tests -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34415) Move away from Kafka-Zookeeper based tests in favor of Kafka-KRaft
Martijn Visser created FLINK-34415: -- Summary: Move away from Kafka-Zookeeper based tests in favor of Kafka-KRaft Key: FLINK-34415 URL: https://issues.apache.org/jira/browse/FLINK-34415 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser The current Flink Kafka connector still uses Zookeeper for Kafka-based testing. Since Kafka 3.4, KRaft has been marked as production ready [1]. In order to reduce tech debt, we should remove all the dependencies on Zookeeper and only use KRaft for the Flink Kafka connector. [1] https://cwiki.apache.org/confluence/display/KAFKA/KIP-833%3A+Mark+KRaft+as+Production+Ready -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34413) Drop support for HBase v1
Martijn Visser created FLINK-34413: -- Summary: Drop support for HBase v1 Key: FLINK-34413 URL: https://issues.apache.org/jira/browse/FLINK-34413 Project: Flink Issue Type: Technical Debt Components: Connectors / HBase Reporter: Martijn Visser As discussed in https://lists.apache.org/thread/6663052dmfnqm8wvqoxx9k8jwcshg1zq -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34368) Update GCS filesystems to latest available version v3.0
Martijn Visser created FLINK-34368: -- Summary: Update GCS filesystems to latest available version v3.0 Key: FLINK-34368 URL: https://issues.apache.org/jira/browse/FLINK-34368 Project: Flink Issue Type: Technical Debt Components: FileSystems Reporter: Martijn Visser Assignee: Martijn Visser Update to https://github.com/GoogleCloudDataproc/hadoop-connectors/releases/tag/v3.0.0 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34366) Add support to group rows by column ordinals
Martijn Visser created FLINK-34366: -- Summary: Add support to group rows by column ordinals Key: FLINK-34366 URL: https://issues.apache.org/jira/browse/FLINK-34366 Project: Flink Issue Type: New Feature Components: Table SQL / API Reporter: Martijn Visser Reference: BigQuery https://cloud.google.com/bigquery/docs/reference/standard-sql/query-syntax#group_by_col_ordinals The GROUP BY clause can refer to expression names in the SELECT list. The GROUP BY clause also allows ordinal references to expressions in the SELECT list, using integer values. 1 refers to the first value in the SELECT list, 2 the second, and so forth. The value list can combine ordinals and value names. The following queries are equivalent:
{code:sql}
WITH PlayerStats AS (
  SELECT 'Adams' as LastName, 'Noam' as FirstName, 3 as PointsScored UNION ALL
  SELECT 'Buchanan', 'Jie', 0 UNION ALL
  SELECT 'Coolidge', 'Kiran', 1 UNION ALL
  SELECT 'Adams', 'Noam', 4 UNION ALL
  SELECT 'Buchanan', 'Jie', 13)
SELECT SUM(PointsScored) AS total_points, LastName, FirstName
FROM PlayerStats
GROUP BY LastName, FirstName;

/*--------------+----------+-----------+
 | total_points | LastName | FirstName |
 +--------------+----------+-----------+
 | 7            | Adams    | Noam      |
 | 13           | Buchanan | Jie       |
 | 1            | Coolidge | Kiran     |
 +--------------+----------+-----------*/
{code}
{code:sql}
WITH PlayerStats AS (
  SELECT 'Adams' as LastName, 'Noam' as FirstName, 3 as PointsScored UNION ALL
  SELECT 'Buchanan', 'Jie', 0 UNION ALL
  SELECT 'Coolidge', 'Kiran', 1 UNION ALL
  SELECT 'Adams', 'Noam', 4 UNION ALL
  SELECT 'Buchanan', 'Jie', 13)
SELECT SUM(PointsScored) AS total_points, LastName, FirstName
FROM PlayerStats
GROUP BY 2, 3;

/*--------------+----------+-----------+
 | total_points | LastName | FirstName |
 +--------------+----------+-----------+
 | 7            | Adams    | Noam      |
 | 13           | Buchanan | Jie       |
 | 1            | Coolidge | Kiran     |
 +--------------+----------+-----------*/
{code}
-- This message was sent by Atlassian Jira (v8.20.10#820010)
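Resolving an ordinal reference is mechanical: a 1-based position indexes into the SELECT list, and ordinals and names can be mixed. A small Java sketch of that rewrite step (the class and method names are invented for illustration; this is not planner code):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical helper: rewrites GROUP BY items so that integer ordinals are
// replaced by the SELECT-list expressions they refer to (1-based positions,
// per the BigQuery semantics described above).
public class GroupByOrdinals {

    public static List<String> resolve(List<String> selectList, List<Object> groupBy) {
        return groupBy.stream()
                .map(item -> item instanceof Integer
                        // ordinal: 1 -> first SELECT item, 2 -> second, ...
                        ? selectList.get((Integer) item - 1)
                        // already an expression name; keep as-is
                        : item.toString())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> select = Arrays.asList("SUM(PointsScored)", "LastName", "FirstName");
        // GROUP BY 2, 3  ==  GROUP BY LastName, FirstName
        System.out.println(resolve(select, Arrays.asList(2, 3)));
        // Mixed ordinals and names are also allowed.
        System.out.println(resolve(select, Arrays.asList("LastName", 3)));
    }
}
```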
[jira] [Created] (FLINK-34358) flink-connector-jdbc nightly fails with "Expecting code to raise a throwable"
Martijn Visser created FLINK-34358: -- Summary: flink-connector-jdbc nightly fails with "Expecting code to raise a throwable" Key: FLINK-34358 URL: https://issues.apache.org/jira/browse/FLINK-34358 Project: Flink Issue Type: Bug Components: Connectors / JDBC Reporter: Martijn Visser https://github.com/apache/flink-connector-jdbc/actions/runs/7770283211/job/21190280602#step:14:346 {code:java} [INFO] Running org.apache.flink.connector.jdbc.dialect.cratedb.CrateDBDialectTypeTest Error: Tests run: 19, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.554 s <<< FAILURE! - in org.apache.flink.connector.jdbc.dialect.cratedb.CrateDBDialectTypeTest Error: org.apache.flink.connector.jdbc.dialect.cratedb.CrateDBDialectTypeTest.testDataTypeValidate(TestItem)[19] Time elapsed: 0.018 s <<< FAILURE! java.lang.AssertionError: Expecting code to raise a throwable. [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.apache.flink.connector.jdbc.catalog.JdbcCatalogUtilsTest [INFO] Running org.apache.flink.architecture.ProductionCodeArchitectureTest [INFO] Running org.apache.flink.architecture.ProductionCodeArchitectureBase [INFO] Running org.apache.flink.architecture.rules.ApiAnnotationRules [INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.155 s - in org.apache.flink.connector.jdbc.dialect.JdbcDialectTypeTest [INFO] Running org.apache.flink.architecture.TestCodeArchitectureTest [INFO] Running org.apache.flink.architecture.TestCodeArchitectureTestBase [INFO] Running org.apache.flink.architecture.rules.ITCaseRules [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.109 s - in org.apache.flink.architecture.rules.ApiAnnotationRules [INFO] Running org.apache.flink.architecture.rules.TableApiRules [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.024 s - in org.apache.flink.architecture.rules.TableApiRules [INFO] Running org.apache.flink.architecture.rules.ConnectorRules [INFO] Tests 
run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.31 s - in org.apache.flink.architecture.rules.ConnectorRules [INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.464 s - in org.apache.flink.architecture.ProductionCodeArchitectureBase [INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.468 s - in org.apache.flink.architecture.ProductionCodeArchitectureTest [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.758 s - in org.apache.flink.architecture.rules.ITCaseRules [INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.761 s - in org.apache.flink.architecture.TestCodeArchitectureTestBase [INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.775 s - in org.apache.flink.architecture.TestCodeArchitectureTest [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 110.38 s - in org.apache.flink.connector.jdbc.databases.oracle.xa.OracleExactlyOnceSinkE2eTest [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 172.591 s - in org.apache.flink.connector.jdbc.databases.db2.xa.Db2ExactlyOnceSinkE2eTest [INFO] [INFO] Results: [INFO] Error: Failures: Error:PostgresDialectTypeTest>JdbcDialectTypeTest.testDataTypeValidate:102 Expecting code to raise a throwable. Error:TrinoDialectTypeTest>JdbcDialectTypeTest.testDataTypeValidate:102 Expecting code to raise a throwable. Error:CrateDBDialectTypeTest>JdbcDialectTypeTest.testDataTypeValidate:102 Expecting code to raise a throwable. [INFO] Error: Tests run: 394, Failures: 3, Errors: 0, Skipped: 1 {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34320) Flink Kafka connector tests time out
Martijn Visser created FLINK-34320: -- Summary: Flink Kafka connector tests time out Key: FLINK-34320 URL: https://issues.apache.org/jira/browse/FLINK-34320 Project: Flink Issue Type: Bug Components: Connectors / Kafka Affects Versions: kafka-3.1.0 Reporter: Martijn Visser https://github.com/apache/flink-connector-kafka/actions/runs/7700171105/job/20987805277?pr=83#step:14:61746 {code:java} 2024-01-29T19:45:07.4412975Z 19:45:07,094 [main] INFO org.apache.kafka.common.utils.AppInfoParser [] - App info kafka.producer for producer-client-id unregistered 2024-01-29T19:45:07.4413978Z 19:45:07,097 [main] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl [] - FileChannelManager removed spill file directory /tmp/flink-io-3306202c-1639-4b7b-a54c-381826e3682e 2024-01-29T19:45:07.4414533Z 19:45:07,440 [main] INFO org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerMigrationTest [] - 2024-01-29T19:45:07.4414785Z 2024-01-29T19:45:07.4415494Z Test testRestoreProducer[Migration Savepoint: 1.16](org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerMigrationTest) successfully run. 2024-01-29T19:45:07.4415646Z 2024-01-29T19:45:07.4698277Z [WARNING] Tests run: 18, Failures: 0, Errors: 0, Skipped: 9, Time elapsed: 206.197 s - in org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerMigrationTest 2024-01-29T20:30:32.8459835Z ##[error]The action has timed out. {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34314) Update CI Node Actions from NodeJS 16 to NodeJS 20
Martijn Visser created FLINK-34314: -- Summary: Update CI Node Actions from NodeJS 16 to NodeJS 20 Key: FLINK-34314 URL: https://issues.apache.org/jira/browse/FLINK-34314 Project: Flink Issue Type: Technical Debt Components: Build System / CI Reporter: Martijn Visser Assignee: Martijn Visser {code:java} Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: actions/checkout@v3, actions/setup-java@v3, stCarolas/setup-maven@v4.5, actions/cache/restore@v3, actions/cache/save@v3. {code} For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34260) Make flink-connector-aws compatible with SinkV2 changes
Martijn Visser created FLINK-34260: -- Summary: Make flink-connector-aws compatible with SinkV2 changes Key: FLINK-34260 URL: https://issues.apache.org/jira/browse/FLINK-34260 Project: Flink Issue Type: Bug Components: Connectors / AWS Affects Versions: aws-connector-4.3.0 Reporter: Martijn Visser https://github.com/apache/flink-connector-aws/actions/runs/7689300085/job/20951547366#step:9:798 {code:java} Error: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile (default-testCompile) on project flink-connector-dynamodb: Compilation failure Error: /home/runner/work/flink-connector-aws/flink-connector-aws/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/DynamoDbSinkWriterTest.java:[357,40] incompatible types: org.apache.flink.connector.base.sink.writer.TestSinkInitContext cannot be converted to org.apache.flink.api.connector.sink2.Sink.InitContext {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34259) flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled
Martijn Visser created FLINK-34259: -- Summary: flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled Key: FLINK-34259 URL: https://issues.apache.org/jira/browse/FLINK-34259 Project: Flink Issue Type: Bug Components: Connectors / JDBC Reporter: Martijn Visser https://github.com/apache/flink-connector-jdbc/actions/runs/7682035724/job/20935884874#step:14:150 {code:java} Error: Tests run: 10, Failures: 5, Errors: 4, Skipped: 0, Time elapsed: 7.909 s <<< FAILURE! - in org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest Error: org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat Time elapsed: 3.254 s <<< ERROR! java.lang.NullPointerException: Cannot invoke "org.apache.flink.api.common.serialization.SerializerConfig.hasGenericTypesDisabled()" because "config" is null at org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:85) at org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:99) at org.apache.flink.connector.jdbc.JdbcTestBase.getSerializer(JdbcTestBase.java:70) at org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat(JdbcRowOutputFormatTest.java:336) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) {code} Seems to be caused by FLINK-34122 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34244) Upgrade Confluent Platform to latest compatible version
Martijn Visser created FLINK-34244: -- Summary: Upgrade Confluent Platform to latest compatible version Key: FLINK-34244 URL: https://issues.apache.org/jira/browse/FLINK-34244 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka, Formats (JSON, Avro, Parquet, ORC, SequenceFile) Affects Versions: kafka-3.1.0, 1.19.0 Reporter: Martijn Visser Assignee: Martijn Visser Flink uses Confluent Platform for its Confluent Avro Schema Registry implementation, and we can update that to the latest version. It's also used by the Flink Kafka connector, where we should upgrade it to the latest version that's compatible with the Kafka client in use (in this case, 7.4.x). -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34193) Remove usage of Flink-Shaded Jackson and Snakeyaml in flink-connector-kafka
Martijn Visser created FLINK-34193: -- Summary: Remove usage of Flink-Shaded Jackson and Snakeyaml in flink-connector-kafka Key: FLINK-34193 URL: https://issues.apache.org/jira/browse/FLINK-34193 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser The Flink Kafka connector doesn't have a direct dependency in the POM on flink-shaded, but it still uses the shaded versions of Jackson and SnakeYAML in {{YamlFileMetaDataService.java}} and {{KafkaRecordDeserializationSchemaTest}}. These cause problems when trying to compile the Flink Kafka connector for Flink 1.19, since those dependencies have been updated there. Since connectors shouldn't rely on Flink-Shaded, we should refactor these implementations. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34192) Update flink-connector-kafka to be compatible with updated SinkV2 interfaces
Martijn Visser created FLINK-34192: -- Summary: Update flink-connector-kafka to be compatible with updated SinkV2 interfaces Key: FLINK-34192 URL: https://issues.apache.org/jira/browse/FLINK-34192 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser {code:java} Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/table/KafkaTableTestUtils.java:[101,76] incompatible types: java.util.List cannot be converted to java.util.List Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java:[136,46] cannot find symbol Error:symbol: method mock(org.apache.flink.metrics.MetricGroup,org.apache.flink.metrics.groups.OperatorIOMetricGroup) Error:location: class org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java:[171,46] cannot find symbol Error:symbol: method mock(org.apache.flink.metrics.MetricGroup) Error:location: class org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java:[204,54] cannot find symbol Error:symbol: method mock(org.apache.flink.metrics.MetricGroup) Error:location: class org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java:[233,54] cannot find symbol Error:symbol: method mock(org.apache.flink.metrics.MetricGroup) Error:location: class 
org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java:[263,54] cannot find symbol Error:symbol: method mock(org.apache.flink.metrics.MetricGroup) Error:location: class org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java:[294,54] cannot find symbol Error:symbol: method mock(org.apache.flink.metrics.MetricGroup) Error:location: class org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java:[337,54] cannot find symbol Error:symbol: method mock(org.apache.flink.metrics.MetricGroup) Error:location: class org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/sink/KafkaWriterITCase.java:[525,46] cannot find symbol Error:symbol: method mock(org.apache.flink.metrics.MetricGroup) Error:location: class org.apache.flink.runtime.metrics.groups.InternalSinkWriterMetricGroup Error: -> [Help 1] {code} https://github.com/apache/flink-connector-kafka/actions/runs/7597711401/job/20692858078#step:14:221 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34154) Bump org.apache.zookeeper:zookeeper from 3.5.9 to 3.7.2 for Kafka connector
Martijn Visser created FLINK-34154: -- Summary: Bump org.apache.zookeeper:zookeeper from 3.5.9 to 3.7.2 for Kafka connector Key: FLINK-34154 URL: https://issues.apache.org/jira/browse/FLINK-34154 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser Assignee: Martijn Visser The Flink Kafka connector still uses Zookeeper, but only for tests. Version 3.5.9 has a CVE; we should bump it to avoid being falsely flagged for this vulnerability. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34149) Flink Kafka connector can't compile against 1.19-SNAPSHOT
Martijn Visser created FLINK-34149: -- Summary: Flink Kafka connector can't compile against 1.19-SNAPSHOT Key: FLINK-34149 URL: https://issues.apache.org/jira/browse/FLINK-34149 Project: Flink Issue Type: Bug Components: Connectors / Kafka, Runtime / Checkpointing Affects Versions: 1.19.0 Reporter: Martijn Visser The Flink Kafka connector for {{main}} fails for 1.19-SNAPSHOT, see https://github.com/apache/flink-connector-kafka/actions/runs/7569481434/job/20612876543#step:14:134 {code:java} Error: COMPILATION ERROR : [INFO] - Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/main/java/org/apache/flink/connector/kafka/dynamic/source/enumerator/StoppableKafkaEnumContextProxy.java:[65,8] org.apache.flink.connector.kafka.dynamic.source.enumerator.StoppableKafkaEnumContextProxy is not abstract and does not override abstract method setIsProcessingBacklog(boolean) in org.apache.flink.api.connector.source.SplitEnumeratorContext {code} This interface method seems to have been added as part of https://issues.apache.org/jira/browse/FLINK-32514 / https://cwiki.apache.org/confluence/display/FLINK/FLIP-309%3A+Support+using+larger+checkpointing+interval+when+source+is+processing+backlog The FLIP indicates that the changes should be backward compatible, but that appears not to have been the case. -- This message was sent by Atlassian Jira (v8.20.10#820010)
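The compile failure above illustrates a general Java point: adding an abstract method to an interface breaks every existing implementor, while a default method would not. A minimal, Flink-free sketch of that difference (the interface and class names below are invented for illustration and are not the actual Flink types):

```java
// Invented names; illustrates why a newly added *abstract* interface method
// triggers "is not abstract and does not override abstract method ..." for
// existing implementors, whereas a default method stays backward compatible.
interface EnumeratorContext {
    void signal();

    // Had the new method been added with a default body like this, classes
    // written against the old interface would keep compiling unchanged.
    default void setIsProcessingBacklog(boolean backlog) {
        // no-op by default
    }
}

// Written against the old interface; still compiles because the new method
// has a default implementation it can inherit.
class LegacyContextProxy implements EnumeratorContext {
    @Override
    public void signal() {}
}

public class BacklogCompatDemo {
    public static void main(String[] args) {
        EnumeratorContext ctx = new LegacyContextProxy();
        ctx.setIsProcessingBacklog(true); // inherited default, no override needed
        System.out.println("legacy implementor still compiles and runs");
    }
}
```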
[jira] [Created] (FLINK-34113) Update flink-connector-elasticsearch to be compatible with updated SinkV2 interfaces
Martijn Visser created FLINK-34113: -- Summary: Update flink-connector-elasticsearch to be compatible with updated SinkV2 interfaces Key: FLINK-34113 URL: https://issues.apache.org/jira/browse/FLINK-34113 Project: Flink Issue Type: Technical Debt Components: Connectors / ElasticSearch Reporter: Martijn Visser Fix For: elasticsearch-3.2.0 Make sure that the connector is updated to deal with the new changes introduced in FLINK-33973 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34111) Add JSON_QUOTE and JSON_UNQUOTE function
Martijn Visser created FLINK-34111: -- Summary: Add JSON_QUOTE and JSON_UNQUOTE function Key: FLINK-34111 URL: https://issues.apache.org/jira/browse/FLINK-34111 Project: Flink Issue Type: Sub-task Components: Table SQL / API Reporter: Martijn Visser Escapes or unescapes a JSON string, removing traces of offending characters that could prevent parsing. Proposal:
- JSON_QUOTE: Quotes a string by wrapping it with double quote characters and escaping interior quote and other characters, then returning the result as a utf8mb4 string. Returns NULL if the argument is NULL.
- JSON_UNQUOTE: Unquotes the value and returns the result as a string. Returns NULL if the argument is NULL. An error occurs if the value starts and ends with double quotes but is not a valid JSON string literal.
The following characters are reserved in JSON and must be properly escaped to be used in strings:
- Backspace is replaced with \b
- Form feed is replaced with \f
- Newline is replaced with \n
- Carriage return is replaced with \r
- Tab is replaced with \t
- Double quote is replaced with \"
- Backslash is replaced with \\
These functions exist in MySQL:
- https://dev.mysql.com/doc/refman/8.0/en/json-creation-functions.html#function_json-quote
- https://dev.mysql.com/doc/refman/8.0/en/json-modification-functions.html#function_json-unquote
It's still open in Calcite as CALCITE-3130. -- This message was sent by Atlassian Jira (v8.20.10#820010)
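The escaping rules above can be sketched in plain Java (a hypothetical reference implementation of the proposed JSON_QUOTE semantics; the class and method names are made up for illustration and are not Flink or Calcite code):

```java
// Hypothetical sketch of the proposed JSON_QUOTE semantics (not Flink code).
public class JsonQuote {

    /** Wraps a string in double quotes, escaping the JSON-reserved characters. */
    public static String jsonQuote(String s) {
        if (s == null) {
            return null; // JSON_QUOTE returns NULL for a NULL argument
        }
        StringBuilder sb = new StringBuilder("\"");
        for (char c : s.toCharArray()) {
            switch (c) {
                case '\b': sb.append("\\b"); break;  // backspace
                case '\f': sb.append("\\f"); break;  // form feed
                case '\n': sb.append("\\n"); break;  // newline
                case '\r': sb.append("\\r"); break;  // carriage return
                case '\t': sb.append("\\t"); break;  // tab
                case '"':  sb.append("\\\""); break; // double quote
                case '\\': sb.append("\\\\"); break; // backslash
                default:   sb.append(c);
            }
        }
        return sb.append('"').toString();
    }

    public static void main(String[] args) {
        System.out.println(jsonQuote("say \"hi\"")); // prints "say \"hi\""
    }
}
```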
[jira] [Created] (FLINK-34108) Add URL_ENCODE and URL_DECODE functions
Martijn Visser created FLINK-34108: -- Summary: Add URL_ENCODE and URL_DECODE functions Key: FLINK-34108 URL: https://issues.apache.org/jira/browse/FLINK-34108 Project: Flink Issue Type: New Feature Components: Table SQL / API Reporter: Martijn Visser Add URL_ENCODE and URL_DECODE functions: URL_ENCODE(str) - Translates a string into 'application/x-www-form-urlencoded' format using a specific encoding scheme. URL_DECODE(str) - Decodes a string in 'application/x-www-form-urlencoded' format using a specific encoding scheme. Related ticket from Calcite: CALCITE-5825 -- This message was sent by Atlassian Jira (v8.20.10#820010)
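Since the proposed semantics follow 'application/x-www-form-urlencoded', the JDK's own codec illustrates the expected behavior; this is only an illustration, not the Flink implementation:

```java
import java.net.URLDecoder;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Illustrates the 'application/x-www-form-urlencoded' round trip that the
// proposed URL_ENCODE/URL_DECODE functions describe, using java.net.
public class UrlCodecSketch {
    public static void main(String[] args) {
        String encoded = URLEncoder.encode("a b&c=d", StandardCharsets.UTF_8);
        String decoded = URLDecoder.decode(encoded, StandardCharsets.UTF_8);
        System.out.println(encoded); // a+b%26c%3Dd
        System.out.println(decoded); // a b&c=d
    }
}
```

Note that in this format a space becomes `+` (not `%20`), which is one detail an implementation would need to pin down.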
[jira] [Created] (FLINK-34020) Bump CI flink version on flink-connector-rabbitmq
Martijn Visser created FLINK-34020: -- Summary: Bump CI flink version on flink-connector-rabbitmq Key: FLINK-34020 URL: https://issues.apache.org/jira/browse/FLINK-34020 Project: Flink Issue Type: Technical Debt Components: Connectors / RabbitMQ Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34019) Bump com.rabbitmq:amqp-client from 5.13.1 to 5.20.0
Martijn Visser created FLINK-34019: -- Summary: Bump com.rabbitmq:amqp-client from 5.13.1 to 5.20.0 Key: FLINK-34019 URL: https://issues.apache.org/jira/browse/FLINK-34019 Project: Flink Issue Type: Technical Debt Components: Connectors / RabbitMQ Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34003) Bump CI flink version on flink-connector-hbase
Martijn Visser created FLINK-34003: -- Summary: Bump CI flink version on flink-connector-hbase Key: FLINK-34003 URL: https://issues.apache.org/jira/browse/FLINK-34003 Project: Flink Issue Type: Technical Debt Components: Connectors / HBase Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-34002) Bump CI flink version on flink-connector-elasticsearch
Martijn Visser created FLINK-34002: -- Summary: Bump CI flink version on flink-connector-elasticsearch Key: FLINK-34002 URL: https://issues.apache.org/jira/browse/FLINK-34002 Project: Flink Issue Type: Technical Debt Components: Connectors / ElasticSearch Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33964) Flink documentation can't be built due to error in Pulsar docs
Martijn Visser created FLINK-33964: -- Summary: Flink documentation can't be built due to error in Pulsar docs Key: FLINK-33964 URL: https://issues.apache.org/jira/browse/FLINK-33964 Project: Flink Issue Type: Bug Components: Documentation Reporter: Martijn Visser Assignee: Leonard Xu https://github.com/apache/flink/actions/runs/7380766702/job/20078487743 {code:java} Start building sites … hugo v0.110.0-e32a493b7826d02763c3b79623952e625402b168+extended linux/amd64 BuildDate=2023-01-17T12:16:09Z VendorInfo=gohugoio Error: Error building site: "/root/flink/docs/themes/connectors/content.zh/docs/connectors/datastream/pulsar.md:491:1": failed to extract shortcode: template for shortcode "generated/pulsar_admin_configuration" not found {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33956) Bump org.apache.zookeeper:zookeeper from 3.7.1 to 3.7.2
Martijn Visser created FLINK-33956: -- Summary: Bump org.apache.zookeeper:zookeeper from 3.7.1 to 3.7.2 Key: FLINK-33956 URL: https://issues.apache.org/jira/browse/FLINK-33956 Project: Flink Issue Type: Technical Debt Components: Runtime / Coordination Reporter: Martijn Visser Assignee: Martijn Visser Bumps org.apache.zookeeper:zookeeper from 3.7.1 to 3.7.2. Merging this pull request will resolve a critical severity [Dependabot alert|https://github.com/apache/flink/security/dependabot/184] on org.apache.zookeeper:zookeeper. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33953) Bump com.google.guava:guava from 31.1-jre to 33.0.0-jre
Martijn Visser created FLINK-33953: -- Summary: Bump com.google.guava:guava from 31.1-jre to 33.0.0-jre Key: FLINK-33953 URL: https://issues.apache.org/jira/browse/FLINK-33953 Project: Flink Issue Type: Technical Debt Components: Connectors / JDBC Reporter: Martijn Visser Assignee: Martijn Visser Resolves two Dependabot reports https://github.com/apache/flink-connector-jdbc/security/dependabot?q=package%3Acom.google.guava%3Aguava+manifest%3Apom.xml+has%3Apatch by upgrading to the latest available version -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33952) Flink JDBC connector build fails on JDK 17
Martijn Visser created FLINK-33952: -- Summary: Flink JDBC connector build fails on JDK 17 Key: FLINK-33952 URL: https://issues.apache.org/jira/browse/FLINK-33952 Project: Flink Issue Type: Bug Components: Connectors / JDBC Affects Versions: jdbc-3.2.0 Environment: {code:java} Caused by: java.lang.RuntimeException: java.lang.reflect.InaccessibleObjectException: Unable to make field private final java.lang.Object[] java.util.Arrays$ArrayList.a accessible: module java.base does not "opens java.util" to unnamed module @75eeccf5 at com.twitter.chill.java.ArraysAsListSerializer.<init>(ArraysAsListSerializer.java:69) at org.apache.flink.api.java.typeutils.runtime.kryo.FlinkChillPackageRegistrar.registerSerializers(FlinkChillPackageRegistrar.java:67) at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.getKryoInstance(KryoSerializer.java:513) at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.checkKryoInitialized(KryoSerializer.java:522) at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.copy(KryoSerializer.java:307) at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:74) at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:50) at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:29) at org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollect(StreamSourceContexts.java:425) at org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collect(StreamSourceContexts.java:520) at org.apache.flink.streaming.api.operators.StreamSourceContexts$SwitchingOnClose.collect(StreamSourceContexts.java:110) at org.apache.flink.connector.jdbc.xa.JdbcExactlyOnceSinkE2eTest$TestEntrySource.emit(JdbcExactlyOnceSinkE2eTest.java:222) at 
org.apache.flink.connector.jdbc.xa.JdbcExactlyOnceSinkE2eTest$TestEntrySource.emitRange(JdbcExactlyOnceSinkE2eTest.java:207) at org.apache.flink.connector.jdbc.xa.JdbcExactlyOnceSinkE2eTest$TestEntrySource.run(JdbcExactlyOnceSinkE2eTest.java:189) at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:114) at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:71) at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:338) Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field private final java.lang.Object[] java.util.Arrays$ArrayList.a accessible: module java.base does not "opens java.util" to unnamed module @75eeccf5 at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354) at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297) at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:178) at java.base/java.lang.reflect.Field.setAccessible(Field.java:172) at com.twitter.chill.java.ArraysAsListSerializer.<init>(ArraysAsListSerializer.java:67) ... 16 more {code} https://github.com/apache/flink-connector-jdbc/actions/runs/7311274336/job/19920665057#step:14:533 Reporter: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
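A common workaround for this class of JDK 17 failure is to open the affected JDK packages to unnamed modules for the test JVM. The command below is a hedged sketch, not a committed fix: it assumes the connector's surefire configuration picks up an `argLine` override, which may not hold for this build.

```shell
# Hypothetical invocation: pass --add-opens to the test JVM so Kryo/chill
# reflection on java.util.Arrays$ArrayList is permitted again on JDK 17.
mvn clean install \
  -DargLine="--add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED"
```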
[jira] [Created] (FLINK-33704) Update GCS filesystems to latest available versions
Martijn Visser created FLINK-33704: -- Summary: Update GCS filesystems to latest available versions Key: FLINK-33704 URL: https://issues.apache.org/jira/browse/FLINK-33704 Project: Flink Issue Type: Technical Debt Components: Connectors / FileSystem, FileSystems Reporter: Martijn Visser Assignee: Martijn Visser Update GS SDK from 2.15.0 to 2.29.1 and GS Hadoop Connector from 2.2.15 to 2.2.18 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33662) Bump com.h2database:h2
Martijn Visser created FLINK-33662: -- Summary: Bump com.h2database:h2 Key: FLINK-33662 URL: https://issues.apache.org/jira/browse/FLINK-33662 Project: Flink Issue Type: Technical Debt Components: Connectors / JDBC Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33585) Upgrade Zookeeper to 3.7.1
Martijn Visser created FLINK-33585: -- Summary: Upgrade Zookeeper to 3.7.1 Key: FLINK-33585 URL: https://issues.apache.org/jira/browse/FLINK-33585 Project: Flink Issue Type: Technical Debt Components: Runtime / Configuration Reporter: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33584) Update Hadoop Filesystems to 3.3.6
Martijn Visser created FLINK-33584: -- Summary: Update Hadoop Filesystems to 3.3.6 Key: FLINK-33584 URL: https://issues.apache.org/jira/browse/FLINK-33584 Project: Flink Issue Type: Technical Debt Components: Connectors / FileSystem Reporter: Martijn Visser Assignee: Martijn Visser Update the Hadoop filesystems to 3.3.6. Some of the key changes: {code:java} * A big update of dependencies to try and keep those reports of transitive CVEs under control -both genuine and false positives. * Critical fix to ABFS input stream prefetching for correct reading. * Vectored IO API for all FSDataInputStream implementations, with high-performance versions for file:// and s3a:// filesystems. file:// through java native IO s3a:// parallel GET requests. * Arm64 binaries. Note, because the arm64 release was on a different platform, the jar files may not match those of the x86 release -and therefore the maven artifacts. * Security fixes in Hadoop's own code. {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33567) Flink documentation should only display connector download links when a connector is available
Martijn Visser created FLINK-33567: -- Summary: Flink documentation should only display connector download links when a connector is available Key: FLINK-33567 URL: https://issues.apache.org/jira/browse/FLINK-33567 Project: Flink Issue Type: Bug Components: Documentation Affects Versions: 1.18.0, 1.17.0 Reporter: Martijn Visser Assignee: Martijn Visser We currently have the situation that: 1. When visiting the master documentation, a message is correctly displayed that there are only connectors available for stable (= released) versions of Flink 2. When visiting the docs for release-1.18 or release-1.17, download links to non-existent pages are sometimes displayed, because no compatible version of the connector is available (yet). In order to solve this, we should: 1. Add a Flink Compatibility collection to the connector repo doc 2. Use the compatibility collection in the documentation to display the correct links if a connector version for that Flink version is available, and otherwise display the message that there's no connector release available for that Flink version
[jira] [Created] (FLINK-33544) Flink documentation fails to build
Martijn Visser created FLINK-33544: -- Summary: Flink documentation fails to build Key: FLINK-33544 URL: https://issues.apache.org/jira/browse/FLINK-33544 Project: Flink Issue Type: Bug Components: Build System, Documentation Reporter: Martijn Visser Assignee: Chesnay Schepler {code:java} hugo README.md LICENSE fatal: detected dubious ownership in repository at '/root/flink' To add an exception for this directory, call: git config --global --add safe.directory /root/flink {code} https://github.com/apache/flink/actions/runs/6861406527/job/18657037038#step:5:153 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33516) Create dedicated PyFlink channel
Martijn Visser created FLINK-33516: -- Summary: Create dedicated PyFlink channel Key: FLINK-33516 URL: https://issues.apache.org/jira/browse/FLINK-33516 Project: Flink Issue Type: Improvement Components: Documentation Reporter: Martijn Visser Assignee: Martijn Visser See https://lists.apache.org/thread/ynb5drhqqbd84w4o4337qv47100cp67h 1. Create new Slack channel 2. Update website -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33400) Pulsar connector doesn't compile for Flink 1.18 due to Archunit update
Martijn Visser created FLINK-33400: -- Summary: Pulsar connector doesn't compile for Flink 1.18 due to Archunit update Key: FLINK-33400 URL: https://issues.apache.org/jira/browse/FLINK-33400 Project: Flink Issue Type: Bug Components: Connectors / Pulsar Affects Versions: pulsar-4.0.1 Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33361) Add Java 17 compatibility to Flink Kafka consumer
Martijn Visser created FLINK-33361: -- Summary: Add Java 17 compatibility to Flink Kafka consumer Key: FLINK-33361 URL: https://issues.apache.org/jira/browse/FLINK-33361 Project: Flink Issue Type: Improvement Components: Connectors / Kafka Affects Versions: kafka-3.0.1, kafka-3.1.0 Reporter: Martijn Visser When trying to run {{mvn clean install -Dflink.version=1.18.0 -Dscala-2.12 -Prun-end-to-end-tests -DdistDir=/Users/mvisser/Developer/flink-1.18.0 -Dflink.convergence.phase=install -Dlog4j.configurationFile=tools/ci/log4j.properties}} this fails with errors like: {code:java} [INFO] [INFO] Results: [INFO] [ERROR] Errors: [ERROR] FlinkKafkaConsumerBaseMigrationTest.testRestore [ERROR] Run 1: Exception while creating StreamOperatorStateContext. [ERROR] Run 2: Exception while creating StreamOperatorStateContext. [ERROR] Run 3: Exception while creating StreamOperatorStateContext. [ERROR] Run 4: Exception while creating StreamOperatorStateContext. [ERROR] Run 5: Exception while creating StreamOperatorStateContext. [ERROR] Run 6: Exception while creating StreamOperatorStateContext. [ERROR] Run 7: Exception while creating StreamOperatorStateContext. [ERROR] Run 8: Exception while creating StreamOperatorStateContext. [ERROR] Run 9: Exception while creating StreamOperatorStateContext. [INFO] [ERROR] FlinkKafkaConsumerBaseTest.testExplicitStateSerializerCompatibility:721 » Runtime [ERROR] FlinkKafkaConsumerBaseTest.testScaleDown:742->testRescaling:817 » Checkpoint C... [ERROR] FlinkKafkaConsumerBaseTest.testScaleUp:737->testRescaling:817 » Checkpoint Cou... [ERROR] UpsertKafkaDynamicTableFactoryTest.testBufferedTableSink:243 » UncheckedIO jav... 
{code} Example stacktrace: {code:java} Test testBufferedTableSink(org.apache.flink.streaming.connectors.kafka.table.UpsertKafkaDynamicTableFactoryTest) failed with: java.io.UncheckedIOException: java.io.IOException: Serializing the source elements failed: java.lang.reflect.InaccessibleObjectException: Unable to make field private final java.lang.Object[] java.util.Arrays$ArrayList.a accessible: module java.base does not "opens java.util" to unnamed module @45b4c3a9 at org.apache.flink.streaming.api.functions.source.FromElementsFunction.setOutputType(FromElementsFunction.java:162) at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.trySetOutputType(StreamingFunctionUtils.java:84) at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.setOutputType(StreamingFunctionUtils.java:60) at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.setOutputType(AbstractUdfStreamOperator.java:146) at org.apache.flink.streaming.api.operators.SimpleOperatorFactory.setOutputType(SimpleOperatorFactory.java:118) at org.apache.flink.streaming.api.graph.StreamGraph.addOperator(StreamGraph.java:434) at org.apache.flink.streaming.api.graph.StreamGraph.addOperator(StreamGraph.java:402) at org.apache.flink.streaming.api.graph.StreamGraph.addLegacySource(StreamGraph.java:356) at org.apache.flink.streaming.runtime.translators.LegacySourceTransformationTranslator.translateInternal(LegacySourceTransformationTranslator.java:66) at org.apache.flink.streaming.runtime.translators.LegacySourceTransformationTranslator.translateForStreamingInternal(LegacySourceTransformationTranslator.java:53) at org.apache.flink.streaming.runtime.translators.LegacySourceTransformationTranslator.translateForStreamingInternal(LegacySourceTransformationTranslator.java:40) at org.apache.flink.streaming.api.graph.SimpleTransformationTranslator.translateForStreaming(SimpleTransformationTranslator.java:62) at 
org.apache.flink.streaming.api.graph.StreamGraphGenerator.translate(StreamGraphGenerator.java:860) at org.apache.flink.streaming.api.graph.StreamGraphGenerator.transform(StreamGraphGenerator.java:590) at org.apache.flink.streaming.api.graph.StreamGraphGenerator.getParentInputIds(StreamGraphGenerator.java:881) at org.apache.flink.streaming.api.graph.StreamGraphGenerator.translate(StreamGraphGenerator.java:839) at org.apache.flink.streaming.api.graph.StreamGraphGenerator.transform(StreamGraphGenerator.java:590) at org.apache.flink.streaming.api.graph.StreamGraphGenerator.generate(StreamGraphGenerator.java:328) at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraph(StreamExecutionEnvironment.java:2289) at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraph(StreamExecutionEnvironment.java:2280) at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraph(StreamExecutionEnvironment.java:2266) at
[jira] [Created] (FLINK-33339) Update Guava to 32.1.3
Martijn Visser created FLINK-33339: -- Summary: Update Guava to 32.1.3 Key: FLINK-33339 URL: https://issues.apache.org/jira/browse/FLINK-33339 Project: Flink Issue Type: Technical Debt Components: BuildSystem / Shaded Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33336) Upgrade ASM to 9.6
Martijn Visser created FLINK-33336: -- Summary: Upgrade ASM to 9.6 Key: FLINK-33336 URL: https://issues.apache.org/jira/browse/FLINK-33336 Project: Flink Issue Type: Technical Debt Components: BuildSystem / Shaded Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33250) HBase Connector should directly depend on 3rd-party libs instead of flink-shaded repo
Martijn Visser created FLINK-33250: -- Summary: HBase Connector should directly depend on 3rd-party libs instead of flink-shaded repo Key: FLINK-33250 URL: https://issues.apache.org/jira/browse/FLINK-33250 Project: Flink Issue Type: Sub-task Components: Connectors / HBase Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33238) Upgrade org.apache.avro:avro to 1.11.3 to mitigate CVE-2023-39410
Martijn Visser created FLINK-33238: -- Summary: Upgrade org.apache.avro:avro to 1.11.3 to mitigate CVE-2023-39410 Key: FLINK-33238 URL: https://issues.apache.org/jira/browse/FLINK-33238 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka, Formats (JSON, Avro, Parquet, ORC, SequenceFile) Reporter: Martijn Visser Assignee: Martijn Visser We should update AVRO to 1.11.3 to avoid false-positives on CVE-2023-39410 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33234) Bump used Guava version in Kafka E2E tests
Martijn Visser created FLINK-33234: -- Summary: Bump used Guava version in Kafka E2E tests Key: FLINK-33234 URL: https://issues.apache.org/jira/browse/FLINK-33234 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser Assignee: Martijn Visser To resolve existing Dependabot PRs: https://github.com/apache/flink-connector-kafka/security/dependabot?q=package%3Acom.google.guava%3Aguava+manifest%3Aflink-connector-kafka-e2e-tests%2Fflink-end-to-end-tests-common-kafka%2Fpom.xml+has%3Apatch -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33175) Nightly builds from S3 are not available for download, breaking all connector tests
Martijn Visser created FLINK-33175: -- Summary: Nightly builds from S3 are not available for download, breaking all connector tests Key: FLINK-33175 URL: https://issues.apache.org/jira/browse/FLINK-33175 Project: Flink Issue Type: Bug Components: Connectors / Common Reporter: Martijn Visser All downloads of Flink binaries fail with: {code:java} Run wget -q -c https://s3.amazonaws.com/flink-nightly/flink-1.18-SNAPSHOT-bin-scala_2.12.tgz -O - | tar -xz gzip: stdin: unexpected end of file tar: Child returned status 1 tar: Error is not recoverable: exiting now Error: Process completed with exit code 2. {code} This goes for 1.18, but also 1.17 and 1.16 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33104) Nightly run for Flink Kafka connector fails
Martijn Visser created FLINK-33104: -- Summary: Nightly run for Flink Kafka connector fails Key: FLINK-33104 URL: https://issues.apache.org/jira/browse/FLINK-33104 Project: Flink Issue Type: Bug Components: Connectors / Kafka Affects Versions: kafka-3.1.0 Reporter: Martijn Visser {code:java} 2023-09-17T00:29:07.1675694Z [WARNING] Tests run: 18, Failures: 0, Errors: 0, Skipped: 9, Time elapsed: 308.532 s - in org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerMigrationTest 2023-09-17T00:29:07.5171608Z [INFO] 2023-09-17T00:29:07.5172360Z [INFO] Results: 2023-09-17T00:29:07.5172773Z [INFO] 2023-09-17T00:29:07.5173139Z [ERROR] Failures: 2023-09-17T00:29:07.5174181Z [ERROR] Architecture Violation [Priority: MEDIUM] - Rule 'ITCASE tests should use a MiniCluster resource or extension' was violated (13 times): 2023-09-17T00:29:07.5176050Z org.apache.flink.connector.kafka.sink.FlinkKafkaInternalProducerITCase does not satisfy: only one of the following predicates match: 2023-09-17T00:29:07.5177452Z * reside in a package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type InternalMiniClusterExtension and annotated with @RegisterExtension 2023-09-17T00:29:07.5179831Z * reside outside of package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type MiniClusterExtension and annotated with @RegisterExtension or are , and of type MiniClusterTestEnvironment and annotated with @TestEnv 2023-09-17T00:29:07.5181277Z * reside in a package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class InternalMiniClusterExtension 2023-09-17T00:29:07.5182154Z * reside outside of package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class MiniClusterExtension 2023-09-17T00:29:07.5182951Z or contain any fields that are public, static, and of type MiniClusterWithClientResource and final and annotated with @ClassRule or contain any fields that is of type 
MiniClusterWithClientResource and public and final and not static and annotated with @Rule 2023-09-17T00:29:07.5183906Z org.apache.flink.connector.kafka.sink.KafkaSinkITCase does not satisfy: only one of the following predicates match: 2023-09-17T00:29:07.5184769Z * reside in a package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type InternalMiniClusterExtension and annotated with @RegisterExtension 2023-09-17T00:29:07.5185812Z * reside outside of package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type MiniClusterExtension and annotated with @RegisterExtension or are , and of type MiniClusterTestEnvironment and annotated with @TestEnv 2023-09-17T00:29:07.5186880Z * reside in a package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class InternalMiniClusterExtension 2023-09-17T00:29:07.5187929Z * reside outside of package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class MiniClusterExtension 2023-09-17T00:29:07.5189073Z or contain any fields that are public, static, and of type MiniClusterWithClientResource and final and annotated with @ClassRule or contain any fields that is of type MiniClusterWithClientResource and public and final and not static and annotated with @Rule 2023-09-17T00:29:07.5190076Z org.apache.flink.connector.kafka.sink.KafkaTransactionLogITCase does not satisfy: only one of the following predicates match: 2023-09-17T00:29:07.5190946Z * reside in a package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type InternalMiniClusterExtension and annotated with @RegisterExtension 2023-09-17T00:29:07.5191983Z * reside outside of package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type MiniClusterExtension and annotated with @RegisterExtension or are , and of type MiniClusterTestEnvironment and annotated with @TestEnv 2023-09-17T00:29:07.5192845Z * reside in a 
package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class InternalMiniClusterExtension 2023-09-17T00:29:07.5193532Z * reside outside of package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class MiniClusterExtension 2023-09-17T00:29:07.5194300Z or contain any fields that are public, static, and of type MiniClusterWithClientResource and final and annotated with @ClassRule or contain any fields that is of type MiniClusterWithClientResource and public and final and not static and annotated with @Rule 2023-09-17T00:29:07.5195091Z org.apache.flink.connector.kafka.sink.KafkaWriterITCase does not satisfy: only one of the following predicates match: 2023-09-17T00:29:07.5195938Z * reside in a package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type
[jira] [Created] (FLINK-33075) Notice files for Statefun are outdated
Martijn Visser created FLINK-33075: -- Summary: Notice files for Statefun are outdated Key: FLINK-33075 URL: https://issues.apache.org/jira/browse/FLINK-33075 Project: Flink Issue Type: Bug Components: Stateful Functions Affects Versions: statefun-3.3.0 Reporter: Martijn Visser Assignee: Martijn Visser {code:java} - NOTICE files are present - Note: The copyright year is out of date (2020) - Concern: we bundle AnchorJS (MIT) v3.1.0 and this is not listed in the NOTICE file - Concern: "statefun-sdk-java" bundles "com.google.auto.service:auto-service-annotations:jar:1.0-rc6" but does not declare it in the NOTICE - Concern: "statefun-flink-distribution" - bundles "org.apache.kafka:kafka-clients:3.2.3" but declares "org.apache.kafka:kafka-clients:2.4.1" - bundles "com.github.luben:zstd-jni:1.5.2-1" but declares "com.github.luben:zstd-jni:1.4.3-1" - bundles "com.fasterxml.jackson.core:jackson-core:2.13.4" but declares "com.fasterxml.jackson.core:jackson-core:2.12.1" - bundles "com.fasterxml.jackson.core:jackson-annotations:2.13.4" but declares "com.fasterxml.jackson.core:jackson-annotations:2.12.1" - bundles "com.fasterxml.jackson.core:jackson-databind:2.13.4.2" but declares "com.fasterxml.jackson.core:jackson-databind:2.12.1" - bundles "com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:2.13.4" but declares "com.fasterxml.jackson.dataformat:jackson-dataformat-cbor:2.12.1" - bundles "commons-io:commons-io:jar:2.11.0" but declares "commons-io:commons-io:jar:2.8.0" - bundles "commons-codec:commons-codec:1.15" but declares "commons-codec:commons-codec:1.13" - bundles "com.esotericsoftware.minlog:minlog:1.2" but does not declare it - bundles "com.ibm.icu:icu4j:jar:67.1" but does not declare it - bundles "org.objenesis:objenesis:jar:2.1" but does not declare it - bundles "com.esotericsoftware.kryo:kryo:2.24.0" but does not declare it - bundles "commons-collections:commons-collections:3.2.2" but does not declare it - bundles "org.apache.commons:commons-compress:1.21" but 
does not declare it {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33021) AWS nightly builds fail on architecture tests
Martijn Visser created FLINK-33021: -- Summary: AWS nightly builds fail on architecture tests Key: FLINK-33021 URL: https://issues.apache.org/jira/browse/FLINK-33021 Project: Flink Issue Type: Bug Components: Connectors / AWS Affects Versions: aws-connector-4.2.0 Reporter: Martijn Visser https://github.com/apache/flink-connector-aws/actions/runs/6067488560/job/16459208589#step:9:879 {code:java} Error: Failures: Error:Architecture Violation [Priority: MEDIUM] - Rule 'ITCASE tests should use a MiniCluster resource or extension' was violated (1 times): org.apache.flink.connector.firehose.sink.KinesisFirehoseSinkITCase does not satisfy: only one of the following predicates match: * reside in a package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type InternalMiniClusterExtension and annotated with @RegisterExtension * reside outside of package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type MiniClusterExtension and annotated with @RegisterExtension or are , and of type MiniClusterTestEnvironment and annotated with @TestEnv * reside in a package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class InternalMiniClusterExtension * reside outside of package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class MiniClusterExtension or contain any fields that are public, static, and of type MiniClusterWithClientResource and final and annotated with @ClassRule or contain any fields that is of type MiniClusterWithClientResource and public and final and not static and annotated with @Rule [INFO] Error: Tests run: 21, Failures: 1, Errors: 0, Skipped: 0 {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33020) OpensearchSinkTest.testAtLeastOnceSink timed out
Martijn Visser created FLINK-33020: -- Summary: OpensearchSinkTest.testAtLeastOnceSink timed out Key: FLINK-33020 URL: https://issues.apache.org/jira/browse/FLINK-33020 Project: Flink Issue Type: Bug Components: Connectors / Opensearch Affects Versions: opensearch-1.0.2 Reporter: Martijn Visser https://github.com/apache/flink-connector-opensearch/actions/runs/6061205003/job/16446139552#step:13:1029 {code:java} Error: Tests run: 9, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 9.837 s <<< FAILURE! - in org.apache.flink.streaming.connectors.opensearch.OpensearchSinkTest Error: org.apache.flink.streaming.connectors.opensearch.OpensearchSinkTest.testAtLeastOnceSink Time elapsed: 5.022 s <<< ERROR! java.util.concurrent.TimeoutException: testAtLeastOnceSink() timed out after 5 seconds at org.junit.jupiter.engine.extension.TimeoutInvocation.createTimeoutException(TimeoutInvocation.java:70) at org.junit.jupiter.engine.extension.TimeoutInvocation.proceed(TimeoutInvocation.java:59) at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149) at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140) at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84) at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115) at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105) at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45) at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37) at 
org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104) at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:214) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:210) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:135) at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:66) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1541) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) 
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at
[jira] [Created] (FLINK-33019) Pulsar tests hang during nightly builds
Martijn Visser created FLINK-33019: -- Summary: Pulsar tests hang during nightly builds Key: FLINK-33019 URL: https://issues.apache.org/jira/browse/FLINK-33019 Project: Flink Issue Type: Bug Components: Connectors / Pulsar Reporter: Martijn Visser https://github.com/apache/flink-connector-pulsar/actions/runs/6067569890/job/16459404675#step:13:25195 The thread dump shows multiple parked/sleeping threads. No clear indicator of what's wrong. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33018) GCP Pubsub PubSubConsumingTest.testStoppingConnectorWhenDeserializationSchemaIndicatesEndOfStream failed
Martijn Visser created FLINK-33018: -- Summary: GCP Pubsub PubSubConsumingTest.testStoppingConnectorWhenDeserializationSchemaIndicatesEndOfStream failed Key: FLINK-33018 URL: https://issues.apache.org/jira/browse/FLINK-33018 Project: Flink Issue Type: Bug Components: Connectors / Google Cloud PubSub Affects Versions: gcp-pubsub-3.0.2 Reporter: Martijn Visser https://github.com/apache/flink-connector-gcp-pubsub/actions/runs/6061318336/job/16446392844#step:13:507 {code:java} [INFO] [INFO] Results: [INFO] Error: Failures: Error: PubSubConsumingTest.testStoppingConnectorWhenDeserializationSchemaIndicatesEndOfStream:119 expected: ["1", "2", "3"] but was: ["1", "2"] [INFO] Error: Tests run: 30, Failures: 1, Errors: 0, Skipped: 0 [INFO] [INFO] {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-33017) Nightly run for Flink Kafka connector fails
Martijn Visser created FLINK-33017: -- Summary: Nightly run for Flink Kafka connector fails Key: FLINK-33017 URL: https://issues.apache.org/jira/browse/FLINK-33017 Project: Flink Issue Type: Bug Components: Connectors / Kafka Affects Versions: kafka-3.1.0 Reporter: Martijn Visser https://github.com/apache/flink-connector-kafka/actions/runs/6061283403/job/16446313350#step:13:54462 {code:java} 2023-09-03T00:29:28.8942615Z [ERROR] Errors: 2023-09-03T00:29:28.8942799Z [ERROR] FlinkKafkaConsumerBaseMigrationTest.testRestore 2023-09-03T00:29:28.8943079Z [ERROR] Run 1: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8943342Z [ERROR] Run 2: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8943604Z [ERROR] Run 3: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8943903Z [ERROR] Run 4: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8944164Z [ERROR] Run 5: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8944419Z [ERROR] Run 6: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8944714Z [ERROR] Run 7: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8944970Z [ERROR] Run 8: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8945221Z [ERROR] Run 9: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8945294Z [INFO] 2023-09-03T00:29:28.8945577Z [ERROR] FlinkKafkaConsumerBaseMigrationTest.testRestoreFromEmptyStateNoPartitions 2023-09-03T00:29:28.8945769Z [ERROR] Run 1: org/apache/flink/shaded/guava31/com/google/common/collect/ImmutableList 
2023-09-03T00:29:28.8946019Z [ERROR] Run 2: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8946266Z [ERROR] Run 3: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8946525Z [ERROR] Run 4: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8946778Z [ERROR] Run 5: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8947027Z [ERROR] Run 6: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8947269Z [ERROR] Run 7: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8947516Z [ERROR] Run 8: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8947765Z [ERROR] Run 9: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8947834Z [INFO] 2023-09-03T00:29:28.8948117Z [ERROR] FlinkKafkaConsumerBaseMigrationTest.testRestoreFromEmptyStateWithPartitions 2023-09-03T00:29:28.8948407Z [ERROR] Run 1: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8948660Z [ERROR] Run 2: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8948949Z [ERROR] Run 3: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8949192Z [ERROR] Run 4: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8949433Z [ERROR] Run 5: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8949673Z [ERROR] Run 6: Could not initialize class 
org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8949913Z [ERROR] Run 7: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8950155Z [ERROR] Run 8: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8950518Z [ERROR] Run 9: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8950598Z [INFO] 2023-09-03T00:29:28.8950819Z [ERROR] FlinkKafkaProducerMigrationOperatorTest.testRestoreProducer 2023-09-03T00:29:28.8951072Z [ERROR] Run 1: Could not initialize class org.apache.flink.runtime.util.config.memory.ManagedMemoryUtils 2023-09-03T00:29:28.8951318Z [ERROR] Run 2: Could not initialize class
[jira] [Created] (FLINK-33002) Bump snappy-java from 1.1.4 to 1.1.10.1
Martijn Visser created FLINK-33002: -- Summary: Bump snappy-java from 1.1.4 to 1.1.10.1 Key: FLINK-33002 URL: https://issues.apache.org/jira/browse/FLINK-33002 Project: Flink Issue Type: Technical Debt Components: Stateful Functions Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32687) Performance regression on handleGlobalFailureAndRestartAllTasks.BATCH_EVENLY since 2023-07-23
Martijn Visser created FLINK-32687: -- Summary: Performance regression on handleGlobalFailureAndRestartAllTasks.BATCH_EVENLY since 2023-07-23 Key: FLINK-32687 URL: https://issues.apache.org/jira/browse/FLINK-32687 Project: Flink Issue Type: Bug Affects Versions: 1.18.0 Reporter: Martijn Visser http://codespeed.dak8s.net:8000/timeline/#/?exe=5=handleGlobalFailureAndRestartAllTasks.BATCH_EVENLY=on=on=off=2=200 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32686) Benchmark regression on startScheduling.BATCH and startScheduling.STREAMING since 2023-07-24
Martijn Visser created FLINK-32686: -- Summary: Benchmark regression on startScheduling.BATCH and startScheduling.STREAMING since 2023-07-24 Key: FLINK-32686 URL: https://issues.apache.org/jira/browse/FLINK-32686 Project: Flink Issue Type: Bug Affects Versions: 1.18.0 Reporter: Martijn Visser http://codespeed.dak8s.net:8000/timeline/#/?exe=5=startScheduling.STREAMING=on=on=off=2=200 http://codespeed.dak8s.net:8000/timeline/#/?exe=5=startScheduling.BATCH=on=on=off=2=200 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32685) Benchmark regression on sortedMultiInput and sortedTwoInput since 2023-07-18
Martijn Visser created FLINK-32685: -- Summary: Benchmark regression on sortedMultiInput and sortedTwoInput since 2023-07-18 Key: FLINK-32685 URL: https://issues.apache.org/jira/browse/FLINK-32685 Project: Flink Issue Type: Bug Affects Versions: 1.18.0 Reporter: Martijn Visser http://codespeed.dak8s.net:8000/timeline/#/?exe=1=sortedMultiInput=on=on=off=2=200 http://codespeed.dak8s.net:8000/timeline/#/?exe=1=sortedTwoInput=on=on=off=2=200 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32602) Nightly builds for Elasticsearch are failing with "pull access denied for flink-base"
Martijn Visser created FLINK-32602: -- Summary: Nightly builds for Elasticsearch are failing with "pull access denied for flink-base" Key: FLINK-32602 URL: https://issues.apache.org/jira/browse/FLINK-32602 Project: Flink Issue Type: Bug Components: Connectors / ElasticSearch Reporter: Martijn Visser {code:java} Caused by: com.github.dockerjava.api.exception.DockerClientException: Could not build image: pull access denied for flink-base, repository does not exist or may require 'docker login': denied: requested access to the resource is denied {code} https://github.com/apache/flink-connector-elasticsearch/actions/runs/5564892055/jobs/10164792729#step:13:17389 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32599) Benchmarks in Slack have failed consistently since 2023-07-14
Martijn Visser created FLINK-32599: -- Summary: Benchmarks in Slack have failed consistently since 2023-07-14 Key: FLINK-32599 URL: https://issues.apache.org/jira/browse/FLINK-32599 Project: Flink Issue Type: Bug Components: Benchmarks Reporter: Martijn Visser As reported in the #flink-dev-benchmarks channel on Slack, all Jenkins builds have failed: Failed build 1457 of flink-master-benchmarks-java8 (Open): hudson.AbortException: script returned exit code 1 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32525) Update commons-beanutils to 1.9.4
Martijn Visser created FLINK-32525: -- Summary: Update commons-beanutils to 1.9.4 Key: FLINK-32525 URL: https://issues.apache.org/jira/browse/FLINK-32525 Project: Flink Issue Type: Technical Debt Components: Deployment / YARN Reporter: Martijn Visser YARN still tests with commons-beanutils 1.8.3, with a remark that beanutils 1.9+ doesn't work with Hadoop. However, Hadoop 2.10.2 (our minimum supported version) itself uses beanutils 1.9.4, per https://github.com/apache/hadoop/blob/rel/release-2.10.2/hadoop-project/pom.xml#L861-L863 -- This message was sent by Atlassian Jira (v8.20.10#820010)
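A sketch of what the bump could look like in a Maven POM; the module that declares the dependency and the `test` scope are assumptions here, only the 1.9.4 version comes from the issue:

```xml
<!-- Hypothetical dependency pin; the exact module and scope may differ. -->
<!-- Hadoop 2.10.2 itself depends on commons-beanutils 1.9.4. -->
<dependency>
  <groupId>commons-beanutils</groupId>
  <artifactId>commons-beanutils</artifactId>
  <version>1.9.4</version>
  <scope>test</scope>
</dependency>
```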
[jira] [Created] (FLINK-32487) https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=50647=logs=0da23115-68bb-5dcd-192c-bd4c8adebde1=24c3384f-1bcb-57b3-224f-51bf973bbee8=86
Martijn Visser created FLINK-32487: -- Summary: https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=50647=logs=0da23115-68bb-5dcd-192c-bd4c8adebde1=24c3384f-1bcb-57b3-224f-51bf973bbee8=8617 Key: FLINK-32487 URL: https://issues.apache.org/jira/browse/FLINK-32487 Project: Flink Issue Type: Bug Components: API / Core Reporter: Martijn Visser {code:java} Jun 29 03:21:25 03:21:25.954 [INFO] Jun 29 03:21:25 03:21:25.954 [ERROR] Errors: Jun 29 03:21:25 03:21:25.954 [ERROR] SourceCoordinatorAlignmentTest.testAnnounceCombinedWatermarkWithoutStart:192 » RejectedExecution Jun 29 03:21:25 03:21:25.955 [INFO] {code} https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=50611=logs=0da23115-68bb-5dcd-192c-bd4c8adebde1=24c3384f-1bcb-57b3-224f-51bf973bbee8=8613 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32462) Kafka shouldn't rely on Flink-Shaded
Martijn Visser created FLINK-32462: -- Summary: Kafka shouldn't rely on Flink-Shaded Key: FLINK-32462 URL: https://issues.apache.org/jira/browse/FLINK-32462 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32450) Update Kafka CI setup to latest version for PRs and nightly builds
Martijn Visser created FLINK-32450: -- Summary: Update Kafka CI setup to latest version for PRs and nightly builds Key: FLINK-32450 URL: https://issues.apache.org/jira/browse/FLINK-32450 Project: Flink Issue Type: Technical Debt Components: Connectors / Kafka Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32448) Connector Shared Utils checks out wrong branch when running CI for PRs
Martijn Visser created FLINK-32448: -- Summary: Connector Shared Utils checks out wrong branch when running CI for PRs Key: FLINK-32448 URL: https://issues.apache.org/jira/browse/FLINK-32448 Project: Flink Issue Type: Bug Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32390) [Dependabot] Bump socket.io-parser and socket.io and engine.io
Martijn Visser created FLINK-32390: -- Summary: [Dependabot] Bump socket.io-parser and socket.io and engine.io Key: FLINK-32390 URL: https://issues.apache.org/jira/browse/FLINK-32390 Project: Flink Issue Type: Technical Debt Components: Runtime / Web Frontend Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32389) [Dependabot] Bump guava from 27.0.1-jre to 32.0.0-jre
Martijn Visser created FLINK-32389: -- Summary: [Dependabot] Bump guava from 27.0.1-jre to 32.0.0-jre Key: FLINK-32389 URL: https://issues.apache.org/jira/browse/FLINK-32389 Project: Flink Issue Type: Technical Debt Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile) Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32357) Elasticsearch v3.0 won't compile when testing against Flink 1.17.1
Martijn Visser created FLINK-32357: -- Summary: Elasticsearch v3.0 won't compile when testing against Flink 1.17.1 Key: FLINK-32357 URL: https://issues.apache.org/jira/browse/FLINK-32357 Project: Flink Issue Type: Bug Components: Connectors / ElasticSearch Reporter: Martijn Visser {code:java} [INFO] Error: Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test (default-test) on project flink-connector-elasticsearch-base: Execution default-test of goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test failed: org.junit.platform.commons.JUnitException: TestEngine with ID 'archunit' failed to discover tests: com.tngtech.archunit.lang.syntax.elements.MethodsThat.areAnnotatedWith(Ljava/lang/Class;)Ljava/lang/Object; -> [Help 1] {code} https://github.com/apache/flink-connector-elasticsearch/actions/runs/5277721611/jobs/9546112876#step:13:159 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32353) Make Cassandra connector compatible with Flink 1.18
Martijn Visser created FLINK-32353: -- Summary: Make Cassandra connector compatible with Flink 1.18 Key: FLINK-32353 URL: https://issues.apache.org/jira/browse/FLINK-32353 Project: Flink Issue Type: Improvement Components: Connectors / Cassandra Reporter: Martijn Visser The current Cassandra connector in {{main}} fails when testing against Flink 1.18-SNAPSHOT {code:java} Error: Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 8.1 s <<< FAILURE! - in org.apache.flink.architecture.rules.ITCaseRules Error: ITCaseRules.ITCASE_USE_MINICLUSTER Time elapsed: 0.025 s <<< FAILURE! java.lang.AssertionError: Architecture Violation [Priority: MEDIUM] - Rule 'ITCASE tests should use a MiniCluster resource or extension' was violated (1 times): org.apache.flink.streaming.connectors.cassandra.CassandraConnectorITCase does not satisfy: only one of the following predicates match: * reside in a package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type InternalMiniClusterExtension and annotated with @RegisterExtension * reside outside of package 'org.apache.flink.runtime.*' and contain any fields that are static, final, and of type MiniClusterExtension and annotated with @RegisterExtension or are public, static, and of type MiniClusterTestEnvironment and annotated with @TestEnv * reside in a package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class InternalMiniClusterExtension * reside outside of package 'org.apache.flink.runtime.*' and is annotated with @ExtendWith with class MiniClusterExtension or contain any fields that are public, static, and of type MiniClusterWithClientResource and final and annotated with @ClassRule or contain any fields that is of type MiniClusterWithClientResource and public and final and not static and annotated with @Rule {code} https://github.com/apache/flink-connector-cassandra/actions/runs/5276835802/jobs/9544092571#step:13:811 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32348) MongoDB tests are flaky and time out
Martijn Visser created FLINK-32348: -- Summary: MongoDB tests are flaky and time out Key: FLINK-32348 URL: https://issues.apache.org/jira/browse/FLINK-32348 Project: Flink Issue Type: Bug Components: Connectors / MongoDB Reporter: Martijn Visser https://github.com/apache/flink-connector-mongodb/actions/runs/5232649632/jobs/9447519651#step:13:39307 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32331) Print JVM thread dumps when Github Actions workflow gets cancelled/times out
Martijn Visser created FLINK-32331: -- Summary: Print JVM thread dumps when Github Actions workflow gets cancelled/times out Key: FLINK-32331 URL: https://issues.apache.org/jira/browse/FLINK-32331 Project: Flink Issue Type: Improvement Components: Build System / CI, Connectors / Common Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
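One way this could look in a connector workflow; a minimal sketch assuming a JDK-based build, where the step names and build command are placeholders (`jps` and `jstack` ship with the JDK):

```yaml
steps:
  - name: Run tests
    run: mvn -B verify  # placeholder build command
  - name: Print JVM thread dumps
    # Runs when the job fails (e.g. a step timeout) or the workflow is cancelled
    if: failure() || cancelled()
    run: |
      for pid in $(jps -q); do
        jstack "$pid" || true
      done
```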
[jira] [Created] (FLINK-32325) SqlServerDynamicTableSourceITCase is flaky
Martijn Visser created FLINK-32325: -- Summary: SqlServerDynamicTableSourceITCase is flaky Key: FLINK-32325 URL: https://issues.apache.org/jira/browse/FLINK-32325 Project: Flink Issue Type: Bug Components: Connectors / JDBC Affects Versions: jdbc-3.2.0, jdbc-3.1.1 Reporter: Martijn Visser {code:java} [INFO] Running org.apache.flink.connector.jdbc.databases.sqlserver.table.SqlServerDynamicTableSourceITCase [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.533 s - in org.apache.flink.connector.jdbc.JdbcITCase [INFO] Running org.apache.flink.connector.jdbc.databases.sqlserver.table.SqlServerTableSourceITCase Jun 13, 2023 8:49:50 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 39249b3b-40c2-4f71-9598-20abe5e93d2d Prelogin error: host localhost port 32783 Unexpected end of prelogin response after 0 bytes read Jun 13, 2023 8:49:50 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 2c0e4870-7284-4022-b97d-7f441fc834dd Prelogin error: host localhost port 32783 Unexpected end of prelogin response after 0 bytes read Jun 13, 2023 8:49:50 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 34e9eff4-445c-477e-8975-d23180897ff8 Prelogin error: host localhost port 32783 Unexpected end of prelogin response after 0 bytes read Jun 13, 2023 8:49:50 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 0d4fe549-66e7-4354-b7c5-ed7ee66527d2 Prelogin error: host localhost port 32783 Unexpected end of prelogin response after 0 bytes read Jun 13, 2023 8:49:51 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 4e98d176-2f1f-4dec-af3e-798ecb536c39 Prelogin error: host localhost port 32783 Unexpected end of prelogin response after 0 bytes read Jun 13, 2023 8:49:52 AM 
com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 32ba6716-772b-42f3-b27c-9ec1593adcd7 Prelogin error: host localhost port 32783 Error reading prelogin response: Connection reset ClientConnectionId:32ba6716-772b-42f3-b27c-9ec1593adcd7 Jun 13, 2023 8:49:53 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: fe5e363f-fade-48b8-beb1-9f2e3a524282 Prelogin error: host localhost port 32783 Unexpected end of prelogin response after 0 bytes read Jun 13, 2023 8:49:54 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: b454a476-5f05-4cd7-bb43-f62e1f7e030e Prelogin error: host localhost port 32783 Unexpected end of prelogin response after 0 bytes read Jun 13, 2023 8:49:55 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 50282ce3-1fdc-4fa5-8467-4cd4867a8395 Prelogin error: host localhost port 32783 Unexpected end of prelogin response after 0 bytes read Jun 13, 2023 8:49:56 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 837d01bc-7b0c-4532-88d2-1d91671d74f3 Prelogin error: host localhost port 32783 Error reading prelogin response: Connection reset ClientConnectionId:837d01bc-7b0c-4532-88d2-1d91671d74f3 Jun 13, 2023 8:49:57 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 43ed7181-5b5d-46e3-b7d2-f3cd2decb043 Prelogin error: host localhost port 32783 Error reading prelogin response: Connection reset ClientConnectionId:43ed7181-5b5d-46e3-b7d2-f3cd2decb043 Jun 13, 2023 8:49:58 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: f5a54844-ef86-4675-9b39-ede75733686b Prelogin error: host localhost port 32783 Error reading prelogin response: Connection reset ClientConnectionId:f5a54844-ef86-4675-9b39-ede75733686b 
Jun 13, 2023 8:49:59 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: 82da197b-0c48-4cb1-9a0b-e5dbfa27c616 Prelogin error: host localhost port 32783 Error reading prelogin response: Connection reset ClientConnectionId:82da197b-0c48-4cb1-9a0b-e5dbfa27c616 Jun 13, 2023 8:50:00 AM com.microsoft.sqlserver.jdbc.SQLServerConnection Prelogin WARNING: ConnectionID:1 ClientConnectionId: df9a8372-08cb-4686-85de-2a5ca5aabe52 Prelogin error: host localhost port 32783 Error reading prelogin response: Connection reset ClientConnectionId:df9a8372-08cb-4686-85de-2a5ca5aabe52 {code} https://github.com/apache/flink-connector-jdbc/actions/runs/5253247136/jobs/9490321045#step:13:322 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32313) CrateDB relies on flink-shaded in flink-connector-jdbc
Martijn Visser created FLINK-32313: -- Summary: CrateDB relies on flink-shaded in flink-connector-jdbc Key: FLINK-32313 URL: https://issues.apache.org/jira/browse/FLINK-32313 Project: Flink Issue Type: Bug Components: Connectors / JDBC Reporter: Martijn Visser See https://github.com/apache/flink-connector-jdbc/blob/main/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/databases/cratedb/catalog/CrateDBCatalog.java#L27 - JDBC shouldn't rely on flink-shaded. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32306) Multiple batch scheduler performance regressions
Martijn Visser created FLINK-32306: -- Summary: Multiple batch scheduler performance regressions Key: FLINK-32306 URL: https://issues.apache.org/jira/browse/FLINK-32306 Project: Flink Issue Type: Bug Reporter: Martijn Visser InitScheduling.BATCH http://codespeed.dak8s.net:8000/timeline/#/?exe=5=initSchedulingStrategy.BATCH=on=on=off=2=200 schedulingDownstreamTasks.BATCH http://codespeed.dak8s.net:8000/timeline/#/?exe=5=schedulingDownstreamTasks.BATCH=on=on=off=2=200 startScheduling.BATCH http://codespeed.dak8s.net:8000/timeline/#/?exe=5=startScheduling.BATCH=on=on=off=2=200 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32094) startScheduling.BATCH performance regression since May 11th
Martijn Visser created FLINK-32094: -- Summary: startScheduling.BATCH performance regression since May 11th Key: FLINK-32094 URL: https://issues.apache.org/jira/browse/FLINK-32094 Project: Flink Issue Type: Bug Components: Runtime / Coordination Reporter: Martijn Visser http://codespeed.dak8s.net:8000/timeline/#/?exe=5=startScheduling.BATCH=on=on=off=2=200 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32056) Update the used Pulsar connector in flink-python to 4.0.0
Martijn Visser created FLINK-32056: -- Summary: Update the used Pulsar connector in flink-python to 4.0.0 Key: FLINK-32056 URL: https://issues.apache.org/jira/browse/FLINK-32056 Project: Flink Issue Type: Bug Components: API / Python, Connectors / Pulsar Affects Versions: 1.18.0, 1.17.1 Reporter: Martijn Visser Assignee: Martijn Visser flink-python still references and tests flink-connector-pulsar:3.0.0, while it should be using flink-connector-pulsar:4.0.0. That's because the newer version is the only version compatible with Flink 1.17 and it doesn't rely on flink-shaded. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-32032) Upgrade to flink-shaded 17.0
Martijn Visser created FLINK-32032: -- Summary: Upgrade to flink-shaded 17.0 Key: FLINK-32032 URL: https://issues.apache.org/jira/browse/FLINK-32032 Project: Flink Issue Type: Technical Debt Reporter: Martijn Visser Assignee: Martijn Visser Fix For: 1.18.0 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31962) libssl not found when running CI
Martijn Visser created FLINK-31962: -- Summary: libssl not found when running CI Key: FLINK-31962 URL: https://issues.apache.org/jira/browse/FLINK-31962 Project: Flink Issue Type: Bug Components: Build System Affects Versions: 1.16.2, 1.18.0, 1.17.1 Reporter: Martijn Visser Assignee: Martijn Visser {code:java} Installed Maven 3.2.5 to /home/vsts/maven_cache/apache-maven-3.2.5 Installing required software Reading package lists... Building dependency tree... Reading state information... bc is already the newest version (1.07.1-2build1). bc set to manually installed. libapr1 is already the newest version (1.6.5-1ubuntu1). libapr1 set to manually installed. 0 upgraded, 0 newly installed, 0 to remove and 13 not upgraded. --2023-04-27 11:42:53-- http://security.ubuntu.com/ubuntu/pool/main/o/openssl1.0/libssl1.0.0_1.0.2n-1ubuntu5.11_amd64.deb Resolving security.ubuntu.com (security.ubuntu.com)... 91.189.91.39, 185.125.190.36, 185.125.190.39, ... Connecting to security.ubuntu.com (security.ubuntu.com)|91.189.91.39|:80... connected. HTTP request sent, awaiting response... 404 Not Found 2023-04-27 11:42:53 ERROR 404: Not Found. {code} -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31925) Sync benchmark dependency versions with the version used in Flink
Martijn Visser created FLINK-31925: -- Summary: Sync benchmark dependency versions with the version used in Flink Key: FLINK-31925 URL: https://issues.apache.org/jira/browse/FLINK-31925 Project: Flink Issue Type: Technical Debt Components: Benchmarks Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31923) Connector weekly runs are only testing main branches instead of all supported branches
Martijn Visser created FLINK-31923: -- Summary: Connector weekly runs are only testing main branches instead of all supported branches Key: FLINK-31923 URL: https://issues.apache.org/jira/browse/FLINK-31923 Project: Flink Issue Type: Bug Components: Build System, Connectors / Common Reporter: Martijn Visser Assignee: Martijn Visser We have a weekly scheduled build for connectors. That's only triggered for the {{main}} branches, because that's how the GitHub Actions {{schedule}} trigger works, per https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#schedule We can resolve that by having the GitHub Actions workflow check out multiple branches as a matrix to run these weekly tests. -- This message was sent by Atlassian Jira (v8.20.10#820010)
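The matrix approach could be sketched roughly like this; the branch names and build command are hypothetical, and each connector would list its own supported branches:

```yaml
name: Weekly CI
on:
  schedule:
    - cron: '0 0 * * 0'  # every Sunday at midnight UTC
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        branch: [main, v3.0, v3.1]  # placeholder branch names
    steps:
      - uses: actions/checkout@v3
        with:
          ref: ${{ matrix.branch }}
      - run: mvn -B verify  # placeholder build command
```

The scheduled trigger still fires only on the default branch, but each matrix entry checks out and tests a different supported branch.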
[jira] [Created] (FLINK-31770) OracleExactlyOnceSinkE2eTest.testInsert fails for JDBC connector
Martijn Visser created FLINK-31770: -- Summary: OracleExactlyOnceSinkE2eTest.testInsert fails for JDBC connector Key: FLINK-31770 URL: https://issues.apache.org/jira/browse/FLINK-31770 Project: Flink Issue Type: Bug Components: Connectors / JDBC Affects Versions: jdbc-3.1.0 Reporter: Martijn Visser
{code:java}
Caused by: org.apache.flink.util.FlinkRuntimeException: unable to start XA transaction, xid: 201:cea0dbd44c6403283f4050f627bed37c0200:e0070697, error -3: resource manager error has occurred. [XAErr (-3): A resource manager error has occured in the transaction branch. ORA-2045 SQLErr (0)]
	at org.apache.flink.connector.jdbc.xa.XaFacadeImpl.wrapException(XaFacadeImpl.java:369)
	at org.apache.flink.connector.jdbc.xa.XaFacadeImpl.access$800(XaFacadeImpl.java:67)
	at org.apache.flink.connector.jdbc.xa.XaFacadeImpl$Command.lambda$fromRunnable$0(XaFacadeImpl.java:301)
	at org.apache.flink.connector.jdbc.xa.XaFacadeImpl$Command.lambda$fromRunnable$4(XaFacadeImpl.java:340)
	at org.apache.flink.connector.jdbc.xa.XaFacadeImpl.execute(XaFacadeImpl.java:280)
	at org.apache.flink.connector.jdbc.xa.XaFacadeImpl.start(XaFacadeImpl.java:170)
	at org.apache.flink.connector.jdbc.xa.XaFacadePoolingImpl.start(XaFacadePoolingImpl.java:84)
	at org.apache.flink.connector.jdbc.xa.JdbcXaSinkFunction.beginTx(JdbcXaSinkFunction.java:316)
	at org.apache.flink.connector.jdbc.xa.JdbcXaSinkFunction.open(JdbcXaSinkFunction.java:241)
	at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:34)
	at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:100)
	at org.apache.flink.streaming.api.operators.StreamSink.open(StreamSink.java:46)
	at org.apache.flink.streaming.runtime.tasks.RegularOperatorChain.initializeStateAndOpenOperators(RegularOperatorChain.java:107)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreGates(StreamTask.java:731)
	at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$SynchronizedStreamTaskActionExecutor.call(StreamTaskActionExecutor.java:100)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreInternal(StreamTask.java:706)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:672)
	at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:935)
	at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:904)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:728)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:550)
	at java.lang.Thread.run(Thread.java:750)
{code}
https://github.com/apache/flink-connector-jdbc/actions/runs/4647776511/jobs/8224977183#step:13:325 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31745) Performance regression on serializerHeavyString since April 3rd
Martijn Visser created FLINK-31745: -- Summary: Performance regression on serializerHeavyString since April 3rd Key: FLINK-31745 URL: https://issues.apache.org/jira/browse/FLINK-31745 Project: Flink Issue Type: Bug Reporter: Martijn Visser Fix For: 1.18.0 serializerHeavyString baseline=241.682406 current_value=203.24132 http://codespeed.dak8s.net:8000/timeline/#/?exe=1&ben=serializerHeavyString&extr=on&quarts=on&equid=off&env=2&revs=200 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31731) No suitable constructor found for DebeziumAvroSerializationSchema
Martijn Visser created FLINK-31731: -- Summary: No suitable constructor found for DebeziumAvroSerializationSchema Key: FLINK-31731 URL: https://issues.apache.org/jira/browse/FLINK-31731 Project: Flink Issue Type: Bug Components: Connectors / Kafka Affects Versions: 1.18.0, kafka-4.0.0 Reporter: Martijn Visser
{code:java}
Error: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile (default-testCompile) on project flink-connector-kafka: Compilation failure
Error: /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/table/KafkaDynamicTableFactoryTest.java:[939,16] no suitable constructor found for DebeziumAvroSerializationSchema(org.apache.flink.table.types.logical.RowType,java.lang.String,java.lang.String,)
Error: constructor org.apache.flink.formats.avro.registry.confluent.debezium.DebeziumAvroSerializationSchema.DebeziumAvroSerializationSchema(org.apache.flink.table.types.logical.RowType,java.lang.String,java.lang.String,java.lang.String,java.util.Map) is not applicable
Error: (actual and formal argument lists differ in length)
Error: constructor org.apache.flink.formats.avro.registry.confluent.debezium.DebeziumAvroSerializationSchema.DebeziumAvroSerializationSchema(org.apache.flink.formats.avro.AvroRowDataSerializationSchema) is not applicable
Error: (actual and formal argument lists differ in length)
Error: -> [Help 1]
Error:
Error: To see the full stack trace of the errors, re-run Maven with the -e switch.
Error: Re-run Maven using the -X switch to enable full debug logging.
Error:
Error: For more information about the errors and possible solutions, please read the following articles:
Error: [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Error:
Error: After correcting the problems, you can resume the build with the command
Error: mvn -rf :flink-connector-kafka
Error: Process completed with exit code 1.
{code}
https://github.com/apache/flink-connector-kafka/actions/runs/4610715024/jobs/8149513647#step:13:153 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31725) Synchronize dependency version between Flink and flink-connector-pulsar
Martijn Visser created FLINK-31725: -- Summary: Synchronize dependency version between Flink and flink-connector-pulsar Key: FLINK-31725 URL: https://issues.apache.org/jira/browse/FLINK-31725 Project: Flink Issue Type: Technical Debt Components: Connectors / Pulsar Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31722) Remove dependency on flink-shaded
Martijn Visser created FLINK-31722: -- Summary: Remove dependency on flink-shaded Key: FLINK-31722 URL: https://issues.apache.org/jira/browse/FLINK-31722 Project: Flink Issue Type: Technical Debt Components: Connectors / Cassandra Reporter: Martijn Visser The Cassandra connector relies on flink-shaded and uses Flink's shaded Guava. With the externalization of connectors, they should no longer rely on flink-shaded but instead shade dependencies such as this one themselves. -- This message was sent by Atlassian Jira (v8.20.10#820010)
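Shading a dependency inside the connector itself is typically done with the maven-shade-plugin: the connector bundles Guava and relocates it under its own package prefix so it cannot clash with Flink's copy. A minimal sketch of what such a pom.xml fragment could look like (the relocated package prefix below is illustrative, not the actual one the connector ended up using):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- Move Guava classes under a connector-specific prefix
                 (hypothetical prefix, shown for illustration only). -->
            <pattern>com.google.common</pattern>
            <shadedPattern>org.apache.flink.connector.cassandra.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Relocation rewrites both the bundled class files and all bytecode references to them, so the connector keeps working against any Flink version regardless of which Guava flink-shaded happens to bundle.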
[jira] [Created] (FLINK-31719) Update Netty to 4.1.91-Final
Martijn Visser created FLINK-31719: -- Summary: Update Netty to 4.1.91-Final Key: FLINK-31719 URL: https://issues.apache.org/jira/browse/FLINK-31719 Project: Flink Issue Type: Technical Debt Components: BuildSystem / Shaded Reporter: Martijn Visser Assignee: Martijn Visser This is a bug fix release that contains an important fix for Netty's native SSL implementation, addressing an issue that could lead to problems in state machines. https://netty.io/news/2023/04/03/4-1-91-Final.html -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31712) Allow skipping of archunit tests for nightly connector builds
Martijn Visser created FLINK-31712: -- Summary: Allow skipping of archunit tests for nightly connector builds Key: FLINK-31712 URL: https://issues.apache.org/jira/browse/FLINK-31712 Project: Flink Issue Type: Improvement Components: Build System, Connectors / Common Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31705) Remove Conjars
Martijn Visser created FLINK-31705: -- Summary: Remove Conjars Key: FLINK-31705 URL: https://issues.apache.org/jira/browse/FLINK-31705 Project: Flink Issue Type: Technical Debt Components: Build System Reporter: Martijn Visser Assignee: Martijn Visser With Conjars no longer being available (only https://conjars.wensel.net/ remains), we should remove all references to Conjars in Flink. We've already removed the need for Conjars by excluding Pentaho as part of FLINK-27640, which means no remaining dependency relies on Conjars. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31694) Bump ua-parser-js from 0.7.31 to 0.7.33
Martijn Visser created FLINK-31694: -- Summary: Bump ua-parser-js from 0.7.31 to 0.7.33 Key: FLINK-31694 URL: https://issues.apache.org/jira/browse/FLINK-31694 Project: Flink Issue Type: Technical Debt Components: Runtime / Web Frontend Reporter: Martijn Visser Assignee: Martijn Visser Dependabot PR: https://github.com/apache/flink/pull/21767 -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31693) Bump http-cache-semantics from 4.1.0 to 4.1.1 in
Martijn Visser created FLINK-31693: -- Summary: Bump http-cache-semantics from 4.1.0 to 4.1.1 in Key: FLINK-31693 URL: https://issues.apache.org/jira/browse/FLINK-31693 Project: Flink Issue Type: Technical Debt Components: Runtime / Web Frontend Reporter: Martijn Visser Assignee: Martijn Visser -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31676) Pulsar connector should not rely on Flink Shaded
Martijn Visser created FLINK-31676: -- Summary: Pulsar connector should not rely on Flink Shaded Key: FLINK-31676 URL: https://issues.apache.org/jira/browse/FLINK-31676 Project: Flink Issue Type: Technical Debt Components: Connectors / Pulsar Reporter: Martijn Visser The Pulsar connector currently depends on Flink Shaded for Guava. However, externalized connectors must not rely on flink-shaded: that simply isn't possible if we want them to work against different Flink versions. -- This message was sent by Atlassian Jira (v8.20.10#820010)
[jira] [Created] (FLINK-31485) Connecting to Kafka and Avro Schema Registry fails with ClassNotFoundException
Martijn Visser created FLINK-31485: -- Summary: Connecting to Kafka and Avro Schema Registry fails with ClassNotFoundException Key: FLINK-31485 URL: https://issues.apache.org/jira/browse/FLINK-31485 Project: Flink Issue Type: Bug Components: Connectors / Kafka Affects Versions: 1.17.0 Reporter: Martijn Visser When running the SQL Client with flink-sql-connector-kafka, flink-sql-avro, and flink-sql-avro-confluent-registry and trying to query the Schema Registry, the job fails with
{code:bash}
[ERROR] Could not execute SQL statement. Reason: java.lang.ClassNotFoundException: com.google.common.base.Ticker
{code}
-- This message was sent by Atlassian Jira (v8.20.10#820010)
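A quick way to diagnose this kind of ClassNotFoundException is to probe which variants of the class are actually loadable from the assembled classpath: the unrelocated Guava name, or a relocated one baked into one of the shaded jars. A small stand-alone sketch (not part of Flink; the relocated class name below is hypothetical, the actual prefix should be checked with jar tf against the jars in use):

```java
import java.util.Arrays;
import java.util.List;

public class GuavaProbe {
    /** Returns true if the given class name can be loaded on the current classpath. */
    static boolean isLoadable(String className) {
        try {
            // initialize=false: we only care about presence, not static initializers.
            Class.forName(className, false, GuavaProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Candidate names to probe; the relocated one is a guess for illustration.
        List<String> candidates = Arrays.asList(
                "com.google.common.base.Ticker",
                "org.apache.flink.shaded.guava.com.google.common.base.Ticker");
        for (String name : candidates) {
            System.out.println(name + " -> " + (isLoadable(name) ? "loadable" : "missing"));
        }
    }
}
```

If neither name is loadable, the class referencing Ticker was compiled against a Guava that no jar on the classpath provides under that name, which points at a shading/relocation mismatch between the bundled artifacts.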
[jira] [Created] (FLINK-31446) KafkaSinkITCase$IntegrationTests.testMetrics failed because topic XXX already exists
Martijn Visser created FLINK-31446: -- Summary: KafkaSinkITCase$IntegrationTests.testMetrics failed because topic XXX already exists Key: FLINK-31446 URL: https://issues.apache.org/jira/browse/FLINK-31446 Project: Flink Issue Type: Bug Components: Connectors / Kafka Reporter: Martijn Visser
{code:java}
Mar 14 02:07:46 	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
Mar 14 02:07:46 	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
Mar 14 02:07:46 	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
Mar 14 02:07:46 	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
Mar 14 02:07:46 	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
Mar 14 02:07:46 	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
Mar 14 02:07:46 	at org.junit.platform.engine.support.hierarchical.ForkJoinPoolHierarchicalTestExecutorService$ExclusiveTask.compute(ForkJoinPoolHierarchicalTestExecutorService.java:185)
Mar 14 02:07:46 	at java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:189)
Mar 14 02:07:46 	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
Mar 14 02:07:46 	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
Mar 14 02:07:46 	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
Mar 14 02:07:46 	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Mar 14 02:07:46 Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TopicExistsException: Topic 'kafka-single-topic-1095096269466403022' already exists.
Mar 14 02:07:46 	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
Mar 14 02:07:46 	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
Mar 14 02:07:46 	at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:165)
Mar 14 02:07:46 	at org.apache.flink.connector.kafka.sink.testutils.KafkaSinkExternalContext.createTopic(KafkaSinkExternalContext.java:101)
Mar 14 02:07:46 	... 110 more
Mar 14 02:07:46 Caused by: org.apache.kafka.common.errors.TopicExistsException: Topic 'kafka-single-topic-1095096269466403022' already exists.
{code}
https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=47127&view=logs&j=aa18c3f6-13b8-5f58-86bb-c1cffb239496&t=502fb6c0-30a2-5e49-c5c2-a00fa3acb203&l=36477 -- This message was sent by Atlassian Jira (v8.20.10#820010)
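One common way to make such test setups idempotent is to treat "topic already exists" as benign. Kafka's AdminClient surfaces TopicExistsException wrapped in an ExecutionException, so the fixture has to unwrap the cause chain before deciding whether to rethrow. A self-contained sketch of that pattern (hypothetical helper, not the actual KafkaSinkExternalContext code; it matches by class name only so this example needs no kafka-clients dependency, and uses IllegalStateException as a stand-in):

```java
import java.util.concurrent.ExecutionException;

public class TopicSetup {
    /** Walks the cause chain and reports whether any cause has the given class name. */
    static boolean isCausedBy(Throwable t, String exceptionClassName) {
        for (Throwable c = t; c != null; c = c.getCause()) {
            if (c.getClass().getName().equals(exceptionClassName)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Simulated AdminClient failure: the real code would see an
        // ExecutionException caused by TopicExistsException.
        Exception simulated = new ExecutionException(
                new IllegalStateException("Topic already exists."));
        boolean benign = isCausedBy(simulated, "java.lang.IllegalStateException");
        System.out.println(benign ? "topic already exists, continuing" : "real failure, rethrow");
    }
}
```

An alternative fix, which the randomized topic names above already attempt, is to guarantee uniqueness per test run; tolerating the exception additionally covers retries against a broker where a previous attempt partially succeeded.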