[jira] [Commented] (FLINK-25537) [JUnit5 Migration] Module: flink-core

2024-05-07 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-25537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17844226#comment-17844226
 ] 

Jiabao Sun commented on FLINK-25537:


master: ffa3869c48a68c1dd3126fa949adc6953979711f

> [JUnit5 Migration] Module: flink-core
> -
>
> Key: FLINK-25537
> URL: https://issues.apache.org/jira/browse/FLINK-25537
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Core
>Reporter: Qingsheng Ren
>Assignee: Aiden Gong
>Priority: Minor
>  Labels: pull-request-available, stale-assigned
> Fix For: 1.20.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Closed] (FLINK-35245) Add metrics for flink-connector-tidb-cdc

2024-05-06 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun closed FLINK-35245.
--
Resolution: Implemented

Implemented via cdc-master: fa6e7ea51258dcd90f06036196618224156df367

> Add metrics for flink-connector-tidb-cdc
> 
>
> Key: FLINK-35245
> URL: https://issues.apache.org/jira/browse/FLINK-35245
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Reporter: Xie Yi
>Assignee: Xie Yi
>Priority: Major
>  Labels: pull-request-available
> Fix For: cdc-3.2.0
>
>
> [https://github.com/apache/flink-cdc/issues/985] was closed but never 
> resolved.
> Creating a new issue to track it.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (FLINK-35245) Add metrics for flink-connector-tidb-cdc

2024-05-06 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-35245:
---
Fix Version/s: cdc-3.2.0

> Add metrics for flink-connector-tidb-cdc
> 
>
> Key: FLINK-35245
> URL: https://issues.apache.org/jira/browse/FLINK-35245
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Reporter: Xie Yi
>Assignee: Xie Yi
>Priority: Major
>  Labels: pull-request-available
> Fix For: cdc-3.2.0
>
>
> [https://github.com/apache/flink-cdc/issues/985] was closed but never 
> resolved.
> Creating a new issue to track it.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-35245) Add metrics for flink-connector-tidb-cdc

2024-05-06 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-35245:
--

Assignee: Xie Yi

> Add metrics for flink-connector-tidb-cdc
> 
>
> Key: FLINK-35245
> URL: https://issues.apache.org/jira/browse/FLINK-35245
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Reporter: Xie Yi
>Assignee: Xie Yi
>Priority: Major
>  Labels: pull-request-available
>
> [https://github.com/apache/flink-cdc/issues/985] was closed but never 
> resolved.
> Creating a new issue to track it.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Closed] (FLINK-35274) Occasional failure issue with Flink CDC Db2 UT

2024-05-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun closed FLINK-35274.
--

> Occasional failure issue with Flink CDC Db2 UT
> --
>
> Key: FLINK-35274
> URL: https://issues.apache.org/jira/browse/FLINK-35274
> Project: Flink
>  Issue Type: Bug
>Reporter: Xin Gong
>Assignee: Xin Gong
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.1.0
>
>
> Occasional failures in the Flink CDC Db2 UT. Because the Db2 redo-log 
> tableId doesn't carry a database name, the table schema is occasionally not 
> found when the task restarts after an exception. I will fix it by supplementing the database name.
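The fix described above can be sketched as follows. This is a minimal, self-contained illustration with a hypothetical `TableId` record; the real flink-cdc/Debezium classes differ, and the method name `supplementDatabase` is an assumption for illustration only:

```java
public class TableIdFix {
    // Hypothetical minimal TableId; the real flink-cdc class has more fields.
    record TableId(String database, String table) {}

    // Supplement the missing database name so that schema lookups after a
    // restart key on a complete identifier instead of a partial one.
    static TableId supplementDatabase(TableId id, String configuredDatabase) {
        if (id.database() == null || id.database().isEmpty()) {
            return new TableId(configuredDatabase, id.table());
        }
        return id;
    }

    public static void main(String[] args) {
        // A redo-log event arriving without a database name gets completed
        // from the connector configuration before the schema lookup.
        TableId fromRedoLog = new TableId(null, "ORDERS");
        TableId fixed = supplementDatabase(fromRedoLog, "TESTDB");
        System.out.println(fixed.database() + "." + fixed.table()); // TESTDB.ORDERS
    }
}
```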



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-35274) Occasional failure issue with Flink CDC Db2 UT

2024-05-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-35274.

Resolution: Fixed

Fixed via cdc
* master: a7cb46f7621568486a069a7ae01a7b86ebb0a801
* release-3.1: d556f29475a52234a98bcc65db959483a10beb52

> Occasional failure issue with Flink CDC Db2 UT
> --
>
> Key: FLINK-35274
> URL: https://issues.apache.org/jira/browse/FLINK-35274
> Project: Flink
>  Issue Type: Bug
>Reporter: Xin Gong
>Assignee: Xin Gong
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.1.0
>
>
> Occasional failures in the Flink CDC Db2 UT. Because the Db2 redo-log 
> tableId doesn't carry a database name, the table schema is occasionally not 
> found when the task restarts after an exception. I will fix it by supplementing the database name.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-35274) Occasional failure issue with Flink CDC Db2 UT

2024-05-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-35274:
--

Assignee: Xin Gong

> Occasional failure issue with Flink CDC Db2 UT
> --
>
> Key: FLINK-35274
> URL: https://issues.apache.org/jira/browse/FLINK-35274
> Project: Flink
>  Issue Type: Bug
>Reporter: Xin Gong
>Assignee: Xin Gong
>Priority: Critical
>  Labels: pull-request-available
> Fix For: 3.1.0
>
>
> Occasional failures in the Flink CDC Db2 UT. Because the Db2 redo-log 
> tableId doesn't carry a database name, the table schema is occasionally not 
> found when the task restarts after an exception. I will fix it by supplementing the database name.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Closed] (FLINK-35244) Correct the package for flink-connector-tidb-cdc test

2024-05-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun closed FLINK-35244.
--

>  Correct the package for flink-connector-tidb-cdc test
> --
>
> Key: FLINK-35244
> URL: https://issues.apache.org/jira/browse/FLINK-35244
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Reporter: Xie Yi
>Assignee: Xie Yi
>Priority: Major
>  Labels: pull-request-available
> Fix For: cdc-3.2.0
>
> Attachments: image-2024-04-26-16-19-39-297.png
>
>
> Test cases for flink-connector-tidb-cdc should be under the
> *org.apache.flink.cdc.connectors.tidb* package
> instead of *org.apache.flink.cdc.connectors*.
> !image-2024-04-26-16-19-39-297.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-35244) Correct the package for flink-connector-tidb-cdc test

2024-05-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-35244.

Fix Version/s: cdc-3.2.0
   Resolution: Fixed

Resolved via cdc-master: 002b16ed4e155b01374040ff302b7536d9c41245

>  Correct the package for flink-connector-tidb-cdc test
> --
>
> Key: FLINK-35244
> URL: https://issues.apache.org/jira/browse/FLINK-35244
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Reporter: Xie Yi
>Assignee: Xie Yi
>Priority: Major
>  Labels: pull-request-available
> Fix For: cdc-3.2.0
>
> Attachments: image-2024-04-26-16-19-39-297.png
>
>
> Test cases for flink-connector-tidb-cdc should be under the
> *org.apache.flink.cdc.connectors.tidb* package
> instead of *org.apache.flink.cdc.connectors*.
> !image-2024-04-26-16-19-39-297.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (FLINK-35244) Correct the package for flink-connector-tidb-cdc test

2024-05-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-35244:
---
Summary:  Correct the package for flink-connector-tidb-cdc test  (was: Move 
package for flink-connector-tidb-cdc test)

>  Correct the package for flink-connector-tidb-cdc test
> --
>
> Key: FLINK-35244
> URL: https://issues.apache.org/jira/browse/FLINK-35244
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Reporter: Xie Yi
>Assignee: Xie Yi
>Priority: Major
>  Labels: pull-request-available
> Attachments: image-2024-04-26-16-19-39-297.png
>
>
> Test cases for flink-connector-tidb-cdc should be under the
> *org.apache.flink.cdc.connectors.tidb* package
> instead of *org.apache.flink.cdc.connectors*.
> !image-2024-04-26-16-19-39-297.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-32843) [JUnit5 Migration] The jobmaster package of flink-runtime module

2024-05-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-32843?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-32843.

Fix Version/s: 1.20.0
   Resolution: Fixed

Resolved via master: beb0b167bdcf95f27be87a214a69a174fd49d256

> [JUnit5 Migration] The jobmaster package of flink-runtime module
> 
>
> Key: FLINK-32843
> URL: https://issues.apache.org/jira/browse/FLINK-32843
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Rui Fan
>Assignee: RocMarshal
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 1.20.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-35244) Move package for flink-connector-tidb-cdc test

2024-04-26 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-35244:
--

Assignee: Xie Yi

> Move package for flink-connector-tidb-cdc test
> --
>
> Key: FLINK-35244
> URL: https://issues.apache.org/jira/browse/FLINK-35244
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Reporter: Xie Yi
>Assignee: Xie Yi
>Priority: Major
>  Labels: pull-request-available
> Attachments: image-2024-04-26-16-19-39-297.png
>
>
> Test cases for flink-connector-tidb-cdc should be under the
> *org.apache.flink.cdc.connectors.tidb* package
> instead of *org.apache.flink.cdc.connectors*.
> !image-2024-04-26-16-19-39-297.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-35235) Fix missing dependencies in the uber jar

2024-04-26 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35235?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-35235.

  Assignee: LvYanquan
Resolution: Fixed

Resolved via

* cdc master: ec643c9dd7365261f3cee620d4d6bd5d042917e0
* cdc release-3.1: b96ea11cc7df6c3d57a155573f29c18bf9d787ae

> Fix missing dependencies in the uber jar
> 
>
> Key: FLINK-35235
> URL: https://issues.apache.org/jira/browse/FLINK-35235
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Affects Versions: 3.1.0
>Reporter: LvYanquan
>Assignee: LvYanquan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.1.0
>
> Attachments: image-2024-04-25-15-17-20-987.png, 
> image-2024-04-25-15-17-34-717.png
>
>
> Some Kafka classes were not included in the fat jar.
> !image-2024-04-25-15-17-34-717.png!



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-34738) "Deployment - YARN" Page for Flink CDC Chinese Documentation

2024-04-17 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-34738:
--

Assignee: Vincent Woo

> "Deployment - YARN" Page for Flink CDC Chinese Documentation
> 
>
> Key: FLINK-34738
> URL: https://issues.apache.org/jira/browse/FLINK-34738
> Project: Flink
>  Issue Type: Sub-task
>  Components: chinese-translation, Documentation, Flink CDC
>Affects Versions: cdc-3.1.0
>Reporter: LvYanquan
>Assignee: Vincent Woo
>Priority: Major
>  Labels: pull-request-available
> Fix For: cdc-3.1.0
>
>
> Translate 
> [https://github.com/apache/flink-cdc/blob/master/docs/content/docs/deployment/yarn.md]
>  into Chinese.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-35139) Release flink-connector-mongodb vX.X.X for Flink 1.19

2024-04-17 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-35139?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17838077#comment-17838077
 ] 

Jiabao Sun commented on FLINK-35139:


mongodb main: 660ffe4f33f3ce60da139159741644f48295652d

> Release flink-connector-mongodb vX.X.X for Flink 1.19
> -
>
> Key: FLINK-35139
> URL: https://issues.apache.org/jira/browse/FLINK-35139
> Project: Flink
>  Issue Type: Sub-task
>  Components: Connectors / MongoDB
>Reporter: Danny Cranmer
>Assignee: Danny Cranmer
>Priority: Major
>  Labels: pull-request-available
> Fix For: mongodb-1.2.0
>
>
> https://github.com/apache/flink-connector-mongodb



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-35079) MongoConnector failed to resume token when current collection removed

2024-04-16 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-35079.

Fix Version/s: cdc-3.1.0
   Resolution: Fixed

resolved via cdc master: 0562e35da75fb2c8e512d438adb8f80a87964dc4

> MongoConnector failed to resume token when current collection removed
> -
>
> Key: FLINK-35079
> URL: https://issues.apache.org/jira/browse/FLINK-35079
> Project: Flink
>  Issue Type: Bug
>  Components: Flink CDC
>Reporter: Xiqian YU
>Assignee: Xiqian YU
>Priority: Major
>  Labels: pull-request-available
> Fix For: cdc-3.1.0
>
>
> When the connector tries to create a cursor with an expired resume token 
> during the stream task fetching stage, the MongoDB connector crashes with the 
> message: "error due to Command failed with error 280 (ChangeStreamFatalError): 
> 'cannot resume stream; the resume token was not found.'"



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-35127) CDC ValuesDataSourceITCase crashed due to OutOfMemoryError

2024-04-16 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-35127?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17837929#comment-17837929
 ] 

Jiabao Sun commented on FLINK-35127:


Hi [~kunni],
Could you help take a look?

> CDC ValuesDataSourceITCase crashed due to OutOfMemoryError
> --
>
> Key: FLINK-35127
> URL: https://issues.apache.org/jira/browse/FLINK-35127
> Project: Flink
>  Issue Type: Bug
>  Components: Flink CDC
>Reporter: Jiabao Sun
>Priority: Major
>  Labels: test-stability
> Fix For: cdc-3.1.0
>
>
> {code}
> [INFO] Running 
> org.apache.flink.cdc.connectors.values.source.ValuesDataSourceITCase
> Error: Exception in thread "surefire-forkedjvm-command-thread" 
> java.lang.OutOfMemoryError: Java heap space
> Error:  
> Error:  Exception: java.lang.OutOfMemoryError thrown from the 
> UncaughtExceptionHandler in thread "taskmanager_4-main-scheduler-thread-2"
> Error:  
> Error:  Exception: java.lang.OutOfMemoryError thrown from the 
> UncaughtExceptionHandler in thread "System Time Trigger for Source: values 
> (1/4)#0"
> {code}
> https://github.com/apache/flink-cdc/actions/runs/8698450229/job/23858750352?pr=3221#step:6:1949



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-35127) CDC ValuesDataSourceITCase crashed due to OutOfMemoryError

2024-04-16 Thread Jiabao Sun (Jira)
Jiabao Sun created FLINK-35127:
--

 Summary: CDC ValuesDataSourceITCase crashed due to OutOfMemoryError
 Key: FLINK-35127
 URL: https://issues.apache.org/jira/browse/FLINK-35127
 Project: Flink
  Issue Type: Bug
  Components: Flink CDC
Reporter: Jiabao Sun
 Fix For: cdc-3.1.0


{code}
[INFO] Running 
org.apache.flink.cdc.connectors.values.source.ValuesDataSourceITCase
Error: Exception in thread "surefire-forkedjvm-command-thread" 
java.lang.OutOfMemoryError: Java heap space
Error:  
Error:  Exception: java.lang.OutOfMemoryError thrown from the 
UncaughtExceptionHandler in thread "taskmanager_4-main-scheduler-thread-2"
Error:  
Error:  Exception: java.lang.OutOfMemoryError thrown from the 
UncaughtExceptionHandler in thread "System Time Trigger for Source: values 
(1/4)#0"
{code}

https://github.com/apache/flink-cdc/actions/runs/8698450229/job/23858750352?pr=3221#step:6:1949




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-25537) [JUnit5 Migration] Module: flink-core

2024-04-15 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-25537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17837471#comment-17837471
 ] 

Jiabao Sun commented on FLINK-25537:


master: 138f1f9d17b7a58b092ee7d9fc4c20d968a7b33b

> [JUnit5 Migration] Module: flink-core
> -
>
> Key: FLINK-25537
> URL: https://issues.apache.org/jira/browse/FLINK-25537
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Core
>Reporter: Qingsheng Ren
>Assignee: Aiden Gong
>Priority: Minor
>  Labels: pull-request-available, stale-assigned
> Fix For: 1.20.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Closed] (FLINK-35010) Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.1 for Flink Mongodb connector

2024-04-09 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun closed FLINK-35010.
--

> Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.1 for Flink 
> Mongodb connector
> --
>
> Key: FLINK-35010
> URL: https://issues.apache.org/jira/browse/FLINK-35010
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / MongoDB
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
> Fix For: mongodb-1.2.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-35010) Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.1 for Flink Mongodb connector

2024-04-09 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-35010.

Fix Version/s: mongodb-1.2.0
   Resolution: Fixed

Fixed via mongodb-connector main: ee1146dadf73e91ecb7a2b28cfa879e7fe3b3f22

> Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.1 for Flink 
> Mongodb connector
> --
>
> Key: FLINK-35010
> URL: https://issues.apache.org/jira/browse/FLINK-35010
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / MongoDB
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
> Fix For: mongodb-1.2.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (FLINK-35010) Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.1 for Flink Mongodb connector

2024-04-09 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-35010:
---
Summary: Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.1 for 
Flink Mongodb connector  (was: Bump org.apache.commons:commons-compress from 
1.24.0 to 1.26.0 for Flink Mongodb connector)

> Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.1 for Flink 
> Mongodb connector
> --
>
> Key: FLINK-35010
> URL: https://issues.apache.org/jira/browse/FLINK-35010
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / MongoDB
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-35008) Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink Kafka connector

2024-04-08 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-35008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17835156#comment-17835156
 ] 

Jiabao Sun commented on FLINK-35008:


I agree with Sergey's opinion. 

In version 1.26.0, the commons-codec dependency is optional, and the 
COMPRESS-659 dependency error causes CI failures. 
To avoid this error, we have to add the commons-codec dependency explicitly. 

Although the COMPRESS-659 import error has been fixed in version 1.26.1, 
commons-codec became a non-optional transitive dependency there, which is not 
strictly necessary.
Still, with 1.26.1 we don't need to declare commons-codec explicitly, which may 
be better than using version 1.26.0.

> Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink 
> Kafka connector
> 
>
> Key: FLINK-35008
> URL: https://issues.apache.org/jira/browse/FLINK-35008
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / Kafka
>Reporter: Martijn Visser
>Assignee: Martijn Visser
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34921) SystemProcessingTimeServiceTest fails due to missing output

2024-04-08 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34921?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17834864#comment-17834864
 ] 

Jiabao Sun commented on FLINK-34921:


Maybe we shouldn't use ScheduledFuture.get() to check whether the scheduled 
task has completed.

https://stackoverflow.com/questions/28116301/scheduledfuture-get-is-still-blocked-after-executor-shutdown
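As a hedged sketch of the alternative (plain JDK, independent of the Flink test in question): let the scheduled task signal completion through a latch instead of blocking on ScheduledFuture.get(), which never completes normally for fixed-rate tasks and, per the linked thread, can stay parked even after the executor shuts down:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class LatchInsteadOfFutureGet {
    public static void main(String[] args) throws InterruptedException {
        ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch taskRan = new CountDownLatch(1);
        // The task itself reports completion; no need to block on the future,
        // so executor shutdown cannot leave the test thread parked forever.
        executor.schedule(taskRan::countDown, 10, TimeUnit.MILLISECONDS);
        boolean completed = taskRan.await(5, TimeUnit.SECONDS);
        executor.shutdownNow();
        System.out.println(completed); // true
    }
}
```

Bounding the wait with `await(timeout)` also turns a hang into a clean assertion failure rather than a stuck CI job.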

> SystemProcessingTimeServiceTest fails due to missing output
> ---
>
> Key: FLINK-34921
> URL: https://issues.apache.org/jira/browse/FLINK-34921
> Project: Flink
>  Issue Type: Bug
>  Components: API / DataStream
>Affects Versions: 1.20.0
>Reporter: Matthias Pohl
>Priority: Critical
>  Labels: test-stability
>
> This PR's CI build with {{AdaptiveScheduler}} enabled failed:
> https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58476=logs=0da23115-68bb-5dcd-192c-bd4c8adebde1=24c3384f-1bcb-57b3-224f-51bf973bbee8=11224
> {code}
> "ForkJoinPool-61-worker-25" #863 daemon prio=5 os_prio=0 
> tid=0x7f8c19eba000 nid=0x60a5 waiting on condition [0x7f8bc2cf9000]
> Mar 21 17:19:42java.lang.Thread.State: WAITING (parking)
> Mar 21 17:19:42   at sun.misc.Unsafe.park(Native Method)
> Mar 21 17:19:42   - parking to wait for  <0xd81959b8> (a 
> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask)
> Mar 21 17:19:42   at 
> java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
> Mar 21 17:19:42   at 
> java.util.concurrent.FutureTask.awaitDone(FutureTask.java:429)
> Mar 21 17:19:42   at 
> java.util.concurrent.FutureTask.get(FutureTask.java:191)
> Mar 21 17:19:42   at 
> org.apache.flink.streaming.runtime.tasks.SystemProcessingTimeServiceTest$$Lambda$1443/1477662666.call(Unknown
>  Source)
> Mar 21 17:19:42   at 
> org.assertj.core.api.ThrowableAssert.catchThrowable(ThrowableAssert.java:63)
> Mar 21 17:19:42   at 
> org.assertj.core.api.AssertionsForClassTypes.catchThrowable(AssertionsForClassTypes.java:892)
> Mar 21 17:19:42   at 
> org.assertj.core.api.Assertions.catchThrowable(Assertions.java:1366)
> Mar 21 17:19:42   at 
> org.assertj.core.api.Assertions.assertThatThrownBy(Assertions.java:1210)
> Mar 21 17:19:42   at 
> org.apache.flink.streaming.runtime.tasks.SystemProcessingTimeServiceTest.testQuiesceAndAwaitingCancelsScheduledAtFixRateFuture(SystemProcessingTimeServiceTest.java:92)
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34955) Upgrade commons-compress to 1.26.0

2024-04-07 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34955?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17834779#comment-17834779
 ] 

Jiabao Sun commented on FLINK-34955:


I have rechecked the `commons-codec` dependency in `commons-compress`, and it 
is no longer optional. Even after upgrading to 1.26.1, `commons-codec` will 
still be a transitive dependency. 
Sorry for the disturbance.

> Upgrade commons-compress to 1.26.0
> --
>
> Key: FLINK-34955
> URL: https://issues.apache.org/jira/browse/FLINK-34955
> Project: Flink
>  Issue Type: Improvement
>Reporter: Shilun Fan
>Assignee: Shilun Fan
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.18.2, 1.20.0, 1.19.1
>
>
> commons-compress 1.24.0 has CVE issues; try to upgrade to 1.26.0. We can 
> refer to the Maven link:
> https://mvnrepository.com/artifact/org.apache.commons/commons-compress



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-35010) Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.0 for Flink Mongodb connector

2024-04-07 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-35010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17834778#comment-17834778
 ] 

Jiabao Sun commented on FLINK-35010:


I have rechecked the `commons-codec` dependency in `commons-compress`, and it 
is no longer optional. 
Even after upgrading to 1.26.1, `commons-codec` will still be a transitive 
dependency. 
Please ignore the previous noise; sorry for the disturbance.

> Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.0 for Flink 
> Mongodb connector
> --
>
> Key: FLINK-35010
> URL: https://issues.apache.org/jira/browse/FLINK-35010
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / MongoDB
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-35008) Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink Kafka connector

2024-04-07 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-35008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17834777#comment-17834777
 ] 

Jiabao Sun commented on FLINK-35008:


I have rechecked the `commons-codec` dependency in `commons-compress`, and it 
is no longer optional. 
Even after upgrading to 1.26.1, `commons-codec` will still be a transitive 
dependency. 
Please ignore the previous noise; sorry for the disturbance.

> Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink 
> Kafka connector
> 
>
> Key: FLINK-35008
> URL: https://issues.apache.org/jira/browse/FLINK-35008
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / Kafka
>Reporter: Martijn Visser
>Assignee: Martijn Visser
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-35010) Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.0 for Flink Mongodb connector

2024-04-06 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-35010:
--

Assignee: Zhongqiang Gong

> Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.0 for Flink 
> Mongodb connector
> --
>
> Key: FLINK-35010
> URL: https://issues.apache.org/jira/browse/FLINK-35010
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / MongoDB
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-35010) Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.0 for Flink Mongodb connector

2024-04-06 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-35010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17834607#comment-17834607
 ] 

Jiabao Sun commented on FLINK-35010:


I think we should bump the commons-compress version to 1.26.1 due to 
https://issues.apache.org/jira/browse/COMPRESS-659.

> Bump org.apache.commons:commons-compress from 1.24.0 to 1.26.0 for Flink 
> Mongodb connector
> --
>
> Key: FLINK-35010
> URL: https://issues.apache.org/jira/browse/FLINK-35010
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / MongoDB
>Reporter: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-35008) Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink Kafka connector

2024-04-06 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-35008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17834606#comment-17834606
 ] 

Jiabao Sun commented on FLINK-35008:


Due to an incorrect dependency on the Charsets class from the commons-codec 
package in TarArchiveOutputStream, it is necessary to include the commons-codec 
dependency explicitly to avoid a NoClassDefFoundError at runtime. 
This issue has been fixed in commons-compress 1.26.1.

https://github.com/GOODBOY008/flink-connector-mongodb/actions/runs/8557577952/job/23450146047#step:15:11104
{code}
Caused by: java.lang.RuntimeException: Failed to build JobManager image
at 
org.apache.flink.connector.testframe.container.FlinkTestcontainersConfigurator.configureJobManagerContainer(FlinkTestcontainersConfigurator.java:67)
at 
org.apache.flink.connector.testframe.container.FlinkTestcontainersConfigurator.configure(FlinkTestcontainersConfigurator.java:147)
at 
org.apache.flink.connector.testframe.container.FlinkContainers$Builder.build(FlinkContainers.java:197)
at 
org.apache.flink.tests.util.mongodb.MongoE2ECase.(MongoE2ECase.java:90)
... 56 more
Caused by: org.apache.flink.connector.testframe.container.ImageBuildException: 
Failed to build image "flink-configured-jobmanager"
at 
org.apache.flink.connector.testframe.container.FlinkImageBuilder.build(FlinkImageBuilder.java:234)
at 
org.apache.flink.connector.testframe.container.FlinkTestcontainersConfigurator.configureJobManagerContainer(FlinkTestcontainersConfigurator.java:65)
... 59 more
Caused by: java.lang.RuntimeException: java.lang.NoClassDefFoundError: 
org/apache/commons/codec/Charsets
at org.rnorth.ducttape.timeouts.Timeouts.callFuture(Timeouts.java:68)
at 
org.rnorth.ducttape.timeouts.Timeouts.getWithTimeout(Timeouts.java:43)
at org.testcontainers.utility.LazyFuture.get(LazyFuture.java:45)
at 
org.apache.flink.connector.testframe.container.FlinkImageBuilder.buildBaseImage(FlinkImageBuilder.java:255)
at 
org.apache.flink.connector.testframe.container.FlinkImageBuilder.build(FlinkImageBuilder.java:206)
... 60 more
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/codec/Charsets
at 
org.apache.commons.compress.archivers.tar.TarArchiveOutputStream.(TarArchiveOutputStream.java:212)
at 
org.apache.commons.compress.archivers.tar.TarArchiveOutputStream.(TarArchiveOutputStream.java:157)
at 
org.apache.commons.compress.archivers.tar.TarArchiveOutputStream.(TarArchiveOutputStream.java:147)
at 
org.testcontainers.images.builder.ImageFromDockerfile.resolve(ImageFromDockerfile.java:129)
at 
org.testcontainers.images.builder.ImageFromDockerfile.resolve(ImageFromDockerfile.java:40)
at 
org.testcontainers.utility.LazyFuture.getResolvedValue(LazyFuture.java:17)
at org.testcontainers.utility.LazyFuture.get(LazyFuture.java:39)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.codec.Charsets
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 11 more
{code}


> Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink 
> Kafka connector
> 
>
> Key: FLINK-35008
> URL: https://issues.apache.org/jira/browse/FLINK-35008
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / Kafka
>Reporter: Martijn Visser
>Assignee: Martijn Visser
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-35008) Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink Kafka connector

2024-04-06 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-35008?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17834604#comment-17834604
 ] 

Jiabao Sun commented on FLINK-35008:


Hi [~martijnvisser], maybe we should bump the commons-compress version to 
1.26.1.
In version 1.26.1, there should no longer be a hard dependency on commons-codec.

see: https://issues.apache.org/jira/browse/COMPRESS-659
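For reference, a possible pom.xml change along those lines — the coordinates and version come from this thread, but the exact placement in the connector's pom is an assumption:

```xml
<!-- Hypothetical sketch: pin commons-compress to 1.26.1, the release in
     which the hard commons-codec dependency was removed (COMPRESS-659),
     avoiding the NoClassDefFoundError for org/apache/commons/codec/Charsets. -->
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-compress</artifactId>
  <version>1.26.1</version>
</dependency>
```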

> Bump org.apache.commons:commons-compress from 1.25.0 to 1.26.0 for Flink 
> Kafka connector
> 
>
> Key: FLINK-35008
> URL: https://issues.apache.org/jira/browse/FLINK-35008
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / Kafka
>Reporter: Martijn Visser
>Assignee: Martijn Visser
>Priority: Major
>






[jira] [Commented] (FLINK-34405) RightOuterJoinTaskTest#testCancelOuterJoinTaskWhileSort2 fails due to an interruption of the RightOuterJoinDriver#prepare method

2024-04-06 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17834591#comment-17834591
 ] 

Jiabao Sun commented on FLINK-34405:


taskRunner thread: testDriver() -> AbstractOuterJoinDriver#prepare():101 -> 
WAITING on ExternalSorter#getIterator().

The InterruptedException is always thrown at BinaryOperatorTestBase:209.
It is dropped after the cancel() method is called; see BinaryOperatorTestBase:260.
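As a minimal illustration of that race — plain Java, not the actual test harness; the class and field names are invented, and sleep() stands in for blocking on ExternalSorter#getIterator():

```java
public class InterruptDuringPrepareDemo {

    static volatile boolean sawInterrupt = false;

    public static void main(String[] args) throws InterruptedException {
        // Simulated task-runner thread blocked during "data preparation".
        Thread taskRunner = new Thread(() -> {
            try {
                Thread.sleep(10_000); // stands in for waiting on the sorter
            } catch (InterruptedException e) {
                // This is what later surfaces as
                // "The data preparation caused an error: Interrupted".
                sawInterrupt = true;
            }
        });
        taskRunner.start();
        taskRunner.interrupt(); // plays the role of cancel() interrupting the runner
        taskRunner.join();
        System.out.println(sawInterrupt);
    }
}
```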

> RightOuterJoinTaskTest#testCancelOuterJoinTaskWhileSort2 fails due to an 
> interruption of the RightOuterJoinDriver#prepare method
> 
>
> Key: FLINK-34405
> URL: https://issues.apache.org/jira/browse/FLINK-34405
> Project: Flink
>  Issue Type: Bug
>  Components: API / Core
>Affects Versions: 1.17.2, 1.19.0, 1.18.1, 1.20.0
>Reporter: Matthias Pohl
>Priority: Critical
>  Labels: starter, test-stability
>
> https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=57357=logs=d89de3df-4600-5585-dadc-9bbc9a5e661c=be5a4b15-4b23-56b1-7582-795f58a645a2=9027
> {code}
> Feb 07 03:20:16 03:20:16.223 [ERROR] Failures: 
> Feb 07 03:20:16 03:20:16.223 [ERROR] 
> org.apache.flink.runtime.operators.RightOuterJoinTaskTest.testCancelOuterJoinTaskWhileSort2
> Feb 07 03:20:16 03:20:16.223 [ERROR]   Run 1: 
> RightOuterJoinTaskTest>AbstractOuterJoinTaskTest.testCancelOuterJoinTaskWhileSort2:435
>  
> Feb 07 03:20:16 expected: 
> Feb 07 03:20:16   null
> Feb 07 03:20:16  but was: 
> Feb 07 03:20:16   java.lang.Exception: The data preparation caused an error: 
> Interrupted
> Feb 07 03:20:16   at 
> org.apache.flink.runtime.operators.testutils.BinaryOperatorTestBase.testDriverInternal(BinaryOperatorTestBase.java:209)
> Feb 07 03:20:16   at 
> org.apache.flink.runtime.operators.testutils.BinaryOperatorTestBase.testDriver(BinaryOperatorTestBase.java:189)
> Feb 07 03:20:16   at 
> org.apache.flink.runtime.operators.AbstractOuterJoinTaskTest.access$100(AbstractOuterJoinTaskTest.java:48)
> Feb 07 03:20:16   ...(1 remaining lines not displayed - this can be 
> changed with Assertions.setMaxStackTraceElementsDisplayed)
> {code}





[jira] [Resolved] (FLINK-35011) The change in visibility of MockDeserializationSchema cause compilation failure in kafka connector

2024-04-06 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-35011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-35011.

Resolution: Fixed

Fixed via master: 3590c2d86f4186771ffcd64712f756d31306eb88

> The change in visibility of MockDeserializationSchema cause compilation 
> failure in kafka connector
> --
>
> Key: FLINK-35011
> URL: https://issues.apache.org/jira/browse/FLINK-35011
> Project: Flink
>  Issue Type: Bug
>  Components: Tests
>Affects Versions: 1.20.0
>Reporter: Jiabao Sun
>Assignee: Jiabao Sun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.20.0
>
>
> Flink Kafka connector can't compile with 1.20-SNAPSHOT, see 
> https://github.com/apache/flink-connector-kafka/actions/runs/8553981349/job/23438292087?pr=90#step:15:165
> Error message is:
> {code}
> Error:  Failed to execute goal 
> org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile 
> (default-testCompile) on project flink-connector-kafka: Compilation failure
> Error:  
> /home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBaseTest.java:[60,39]
>  org.apache.flink.streaming.util.MockDeserializationSchema is not public in 
> org.apache.flink.streaming.util; cannot be accessed from outside package
> {code}





[jira] [Created] (FLINK-35011) The change in visibility of MockDeserializationSchema cause compilation failure in kafka connector

2024-04-04 Thread Jiabao Sun (Jira)
Jiabao Sun created FLINK-35011:
--

 Summary: The change in visibility of MockDeserializationSchema 
cause compilation failure in kafka connector
 Key: FLINK-35011
 URL: https://issues.apache.org/jira/browse/FLINK-35011
 Project: Flink
  Issue Type: Bug
  Components: Tests
Affects Versions: 1.20.0
Reporter: Jiabao Sun
Assignee: Jiabao Sun
 Fix For: 1.20.0


Flink Kafka connector can't compile with 1.20-SNAPSHOT, see 
https://github.com/apache/flink-connector-kafka/actions/runs/8553981349/job/23438292087?pr=90#step:15:165

Error message is:

{code}
Error:  Failed to execute goal 
org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile 
(default-testCompile) on project flink-connector-kafka: Compilation failure
Error:  
/home/runner/work/flink-connector-kafka/flink-connector-kafka/flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBaseTest.java:[60,39]
 org.apache.flink.streaming.util.MockDeserializationSchema is not public in 
org.apache.flink.streaming.util; cannot be accessed from outside package
{code}






[jira] [Commented] (FLINK-25544) [JUnit5 Migration] Module: flink-streaming-java

2024-04-04 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-25544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17833989#comment-17833989
 ] 

Jiabao Sun commented on FLINK-25544:


Thanks [~martijnvisser] for reporting this problem.
The visibility of MockDeserializationSchema should not have been modified.
I will check for other changes and create a new ticket to fix it.

> [JUnit5 Migration] Module: flink-streaming-java
> ---
>
> Key: FLINK-25544
> URL: https://issues.apache.org/jira/browse/FLINK-25544
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Hang Ruan
>Assignee: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 1.20.0
>
>






[jira] [Resolved] (FLINK-34948) CDC RowType can not convert to flink row type

2024-03-31 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34948?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34948.

  Assignee: Qishang Zhong
Resolution: Fixed

Fixed via cdc master: d099603ef15d9a1ed7ec33718db7ab2438ef1ab5

> CDC RowType can not convert to flink row type
> -
>
> Key: FLINK-34948
> URL: https://issues.apache.org/jira/browse/FLINK-34948
> Project: Flink
>  Issue Type: Bug
>  Components: Flink CDC
>Reporter: Qishang Zhong
>Assignee: Qishang Zhong
>Priority: Critical
>  Labels: pull-request-available
> Fix For: cdc-3.1.0
>
>
> Fix: a CDC {{RowType}} cannot be converted to a Flink row type.
> I encountered the following exception:
>  
> {code:java}
> java.lang.ArrayStoreException
>     at java.lang.System.arraycopy(Native Method)
>     at java.util.Arrays.copyOf(Arrays.java:3213)
>     at java.util.ArrayList.toArray(ArrayList.java:413)
>     at 
> java.util.Collections$UnmodifiableCollection.toArray(Collections.java:1036) 
> {code}
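For context, this kind of ArrayStoreException can be reproduced in isolation: toArray(T[]) copies via Arrays.copyOf, which preserves the requested runtime component type, so a mismatched element fails inside System.arraycopy with the same stack shape as above. A minimal sketch (invented class name, not the CDC code):

```java
import java.util.ArrayList;
import java.util.List;

public class ArrayStoreDemo {

    // Returns true when copying a list into an array of the wrong
    // component type throws ArrayStoreException from Arrays.copyOf.
    static boolean triggersArrayStore() {
        List<Object> values = new ArrayList<>();
        values.add("flink"); // a String element...
        values.add(42);      // ...mixed with an Integer
        try {
            // The runtime component type Integer[] cannot hold the String,
            // so System.arraycopy inside Arrays.copyOf fails at runtime.
            Integer[] out = values.toArray(new Integer[0]);
            return out == null; // unreachable when the exception is thrown
        } catch (ArrayStoreException expected) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(triggersArrayStore());
    }
}
```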





[jira] [Updated] (FLINK-34948) CDC RowType can not convert to flink row type

2024-03-31 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34948?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-34948:
---
Priority: Critical  (was: Minor)

> CDC RowType can not convert to flink row type
> -
>
> Key: FLINK-34948
> URL: https://issues.apache.org/jira/browse/FLINK-34948
> Project: Flink
>  Issue Type: Bug
>  Components: Flink CDC
>Reporter: Qishang Zhong
>Priority: Critical
>  Labels: pull-request-available
> Fix For: cdc-3.1.0
>
>
> Fix: a CDC {{RowType}} cannot be converted to a Flink row type.
> I encountered the following exception:
>  
> {code:java}
> java.lang.ArrayStoreException
>     at java.lang.System.arraycopy(Native Method)
>     at java.util.Arrays.copyOf(Arrays.java:3213)
>     at java.util.ArrayList.toArray(ArrayList.java:413)
>     at 
> java.util.Collections$UnmodifiableCollection.toArray(Collections.java:1036) 
> {code}





[jira] [Updated] (FLINK-34958) Add support Flink 1.20-SNAPSHOT and bump flink-connector-parent to 1.1.0 for mongodb connector

2024-03-28 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34958?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-34958:
---
Affects Version/s: mongodb-1.1.0
   (was: mongodb-1.0.2)

> Add support Flink 1.20-SNAPSHOT and bump flink-connector-parent to 1.1.0 for 
> mongodb connector
> --
>
> Key: FLINK-34958
> URL: https://issues.apache.org/jira/browse/FLINK-34958
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / MongoDB
>Affects Versions: mongodb-1.1.0
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
> Fix For: mongodb-1.1.0
>
>
> Changes:
>  * Add support Flink 1.20-SNAPSHOT
>  * Bump flink-connector-parent to 1.1.0





[jira] [Updated] (FLINK-34958) Add support Flink 1.20-SNAPSHOT and bump flink-connector-parent to 1.1.0 for mongodb connector

2024-03-28 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34958?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-34958:
---
Fix Version/s: mongodb-1.2.0
   (was: mongodb-1.1.0)

> Add support Flink 1.20-SNAPSHOT and bump flink-connector-parent to 1.1.0 for 
> mongodb connector
> --
>
> Key: FLINK-34958
> URL: https://issues.apache.org/jira/browse/FLINK-34958
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / MongoDB
>Affects Versions: mongodb-1.1.0
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
> Fix For: mongodb-1.2.0
>
>
> Changes:
>  * Add support Flink 1.20-SNAPSHOT
>  * Bump flink-connector-parent to 1.1.0





[jira] [Updated] (FLINK-34958) Add support Flink 1.20-SNAPSHOT and bump flink-connector-parent to 1.1.0 for mongodb connector

2024-03-28 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34958?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-34958:
---
Affects Version/s: mongodb-1.0.2

> Add support Flink 1.20-SNAPSHOT and bump flink-connector-parent to 1.1.0 for 
> mongodb connector
> --
>
> Key: FLINK-34958
> URL: https://issues.apache.org/jira/browse/FLINK-34958
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / MongoDB
>Affects Versions: mongodb-1.0.2
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
> Fix For: mongodb-1.1.0
>
>
> Changes:
>  * Add support Flink 1.20-SNAPSHOT
>  * Bump flink-connector-parent to 1.1.0





[jira] [Resolved] (FLINK-34958) Add support Flink 1.20-SNAPSHOT and bump flink-connector-parent to 1.1.0 for mongodb connector

2024-03-28 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34958?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34958.

Fix Version/s: mongodb-1.1.0
   Resolution: Implemented

Implemented via (mongodb:main) 0dc2640922b3dd2d0ea8565d1bf6606b5d715b0b
a0fe686a6647cb6eba3908bbba336079569959e7

> Add support Flink 1.20-SNAPSHOT and bump flink-connector-parent to 1.1.0 for 
> mongodb connector
> --
>
> Key: FLINK-34958
> URL: https://issues.apache.org/jira/browse/FLINK-34958
> Project: Flink
>  Issue Type: Improvement
>  Components: Connectors / MongoDB
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Minor
>  Labels: pull-request-available
> Fix For: mongodb-1.1.0
>
>
> Changes:
>  * Add support Flink 1.20-SNAPSHOT
>  * Bump flink-connector-parent to 1.1.0





[jira] [Assigned] (FLINK-34753) Update outdated MongoDB CDC FAQ in doc

2024-03-28 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-34753:
--

Assignee: Xiao Huang

> Update outdated MongoDB CDC FAQ in doc
> --
>
> Key: FLINK-34753
> URL: https://issues.apache.org/jira/browse/FLINK-34753
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Affects Versions: cdc-3.1.0
>Reporter: Xiao Huang
>Assignee: Xiao Huang
>Priority: Minor
>  Labels: pull-request-available
> Fix For: cdc-3.1.0
>
>






[jira] [Resolved] (FLINK-34753) Update outdated MongoDB CDC FAQ in doc

2024-03-28 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34753?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34753.

Resolution: Fixed

Resolved by flink-cdc master: 927a0ec4743ac70c5d4edb811da7ffce09658e8b

> Update outdated MongoDB CDC FAQ in doc
> --
>
> Key: FLINK-34753
> URL: https://issues.apache.org/jira/browse/FLINK-34753
> Project: Flink
>  Issue Type: Improvement
>  Components: Flink CDC
>Affects Versions: cdc-3.1.0
>Reporter: Xiao Huang
>Assignee: Xiao Huang
>Priority: Minor
>  Labels: pull-request-available
> Fix For: cdc-3.1.0
>
>






[jira] [Resolved] (FLINK-34719) StreamRecordTest#testWithTimestamp fails on Azure

2024-03-18 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34719.

Fix Version/s: 1.12.0
   Resolution: Fixed

> StreamRecordTest#testWithTimestamp fails on Azure
> -
>
> Key: FLINK-34719
> URL: https://issues.apache.org/jira/browse/FLINK-34719
> Project: Flink
>  Issue Type: Bug
>  Components: Tests
>Affects Versions: 1.20.0
>Reporter: Ryan Skraba
>Assignee: Jiabao Sun
>Priority: Major
>  Labels: pull-request-available, test-stability
> Fix For: 1.12.0
>
>
> The ClassCastException *message* expected in 
> StreamRecordTest#testWithTimestamp as well as 
> StreamRecordTest#testWithNoTimestamp fails on JDK 11, 17, and 21
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=f0ac5c25-1168-55a5-07ff-0e88223afed9=50bf7a25-bdc4-5e56-5478-c7b4511dde53=10341]
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=675bf62c-8558-587e-2555-dcad13acefb5=5878eed3-cc1e-5b12-1ed0-9e7139ce0992=9828]
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=d06b80b4-9e88-5d40-12a2-18072cf60528=609ecd5a-3f6e-5d0c-2239-2096b155a4d0=9833]
> {code:java}
> Expecting throwable message:
> Mar 16 01:35:07   "class 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecord cannot be cast 
> to class org.apache.flink.streaming.api.watermark.Watermark 
> (org.apache.flink.streaming.runtime.streamrecord.StreamRecord and 
> org.apache.flink.streaming.api.watermark.Watermark are in unnamed module of 
> loader 'app')"
> Mar 16 01:35:07 to contain:
> Mar 16 01:35:07   "cannot be cast to 
> org.apache.flink.streaming.api.watermark.Watermark"
> Mar 16 01:35:07 but did not.
> Mar 16 01:35:07 
> Mar 16 01:35:07 Throwable that failed the check:
> Mar 16 01:35:07 
> Mar 16 01:35:07 java.lang.ClassCastException: class 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecord cannot be cast 
> to class org.apache.flink.streaming.api.watermark.Watermark 
> (org.apache.flink.streaming.runtime.streamrecord.StreamRecord and 
> org.apache.flink.streaming.api.watermark.Watermark are in unnamed module of 
> loader 'app')
> Mar 16 01:35:07   at 
> org.apache.flink.streaming.runtime.streamrecord.StreamElement.asWatermark(StreamElement.java:92)
> Mar 16 01:35:07   at 
> org.assertj.core.api.ThrowableAssert.catchThrowable(ThrowableAssert.java:63)
> Mar 16 01:35:07   at 
> org.assertj.core.api.AssertionsForClassTypes.catchThrowable(AssertionsForClassTypes.java:892)
>  {code}





[jira] [Commented] (FLINK-34719) StreamRecordTest#testWithTimestamp fails on Azure

2024-03-18 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34719?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17828129#comment-17828129
 ] 

Jiabao Sun commented on FLINK-34719:


Fixed via master: 8ec5e7e830b5bda30ead3638a1faa3567d80bb7b

> StreamRecordTest#testWithTimestamp fails on Azure
> -
>
> Key: FLINK-34719
> URL: https://issues.apache.org/jira/browse/FLINK-34719
> Project: Flink
>  Issue Type: Bug
>  Components: Tests
>Affects Versions: 1.20.0
>Reporter: Ryan Skraba
>Assignee: Jiabao Sun
>Priority: Major
>  Labels: pull-request-available, test-stability
>
> The ClassCastException *message* expected in 
> StreamRecordTest#testWithTimestamp as well as 
> StreamRecordTest#testWithNoTimestamp fails on JDK 11, 17, and 21
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=f0ac5c25-1168-55a5-07ff-0e88223afed9=50bf7a25-bdc4-5e56-5478-c7b4511dde53=10341]
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=675bf62c-8558-587e-2555-dcad13acefb5=5878eed3-cc1e-5b12-1ed0-9e7139ce0992=9828]
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=d06b80b4-9e88-5d40-12a2-18072cf60528=609ecd5a-3f6e-5d0c-2239-2096b155a4d0=9833]
> {code:java}
> Expecting throwable message:
> Mar 16 01:35:07   "class 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecord cannot be cast 
> to class org.apache.flink.streaming.api.watermark.Watermark 
> (org.apache.flink.streaming.runtime.streamrecord.StreamRecord and 
> org.apache.flink.streaming.api.watermark.Watermark are in unnamed module of 
> loader 'app')"
> Mar 16 01:35:07 to contain:
> Mar 16 01:35:07   "cannot be cast to 
> org.apache.flink.streaming.api.watermark.Watermark"
> Mar 16 01:35:07 but did not.
> Mar 16 01:35:07 
> Mar 16 01:35:07 Throwable that failed the check:
> Mar 16 01:35:07 
> Mar 16 01:35:07 java.lang.ClassCastException: class 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecord cannot be cast 
> to class org.apache.flink.streaming.api.watermark.Watermark 
> (org.apache.flink.streaming.runtime.streamrecord.StreamRecord and 
> org.apache.flink.streaming.api.watermark.Watermark are in unnamed module of 
> loader 'app')
> Mar 16 01:35:07   at 
> org.apache.flink.streaming.runtime.streamrecord.StreamElement.asWatermark(StreamElement.java:92)
> Mar 16 01:35:07   at 
> org.assertj.core.api.ThrowableAssert.catchThrowable(ThrowableAssert.java:63)
> Mar 16 01:35:07   at 
> org.assertj.core.api.AssertionsForClassTypes.catchThrowable(AssertionsForClassTypes.java:892)
>  {code}
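The quoted failure stems from the JDK 9+ ClassCastException message format, which prefixes class names with "class " and appends module/class-loader details. One way to make such an assertion JDK-independent is to match only a message fragment that is stable across JDK versions — a minimal sketch (invented names, not necessarily the actual fix):

```java
public class CastMessageDemo {

    // Provokes a ClassCastException and returns its message, whose exact
    // wording differs between JDK 8 and JDK 9+.
    static String messageFragment() {
        Object element = "stream-record"; // stands in for a StreamRecord
        try {
            Integer watermark = (Integer) element; // deliberate bad cast
            return String.valueOf(watermark);      // unreachable
        } catch (ClassCastException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // "cannot be cast to" appears on JDK 8 ("java.lang.String cannot be
        // cast to java.lang.Integer") and on JDK 9+ (which adds "class "
        // prefixes and module information), so asserting on this fragment
        // is stable across JDK versions.
        System.out.println(messageFragment().contains("cannot be cast to"));
    }
}
```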





[jira] [Assigned] (FLINK-34719) StreamRecordTest#testWithTimestamp fails on Azure

2024-03-18 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-34719:
--

Assignee: Jiabao Sun

> StreamRecordTest#testWithTimestamp fails on Azure
> -
>
> Key: FLINK-34719
> URL: https://issues.apache.org/jira/browse/FLINK-34719
> Project: Flink
>  Issue Type: Bug
>  Components: Tests
>Affects Versions: 1.20.0
>Reporter: Ryan Skraba
>Assignee: Jiabao Sun
>Priority: Major
>  Labels: test-stability
>
> The ClassCastException *message* expected in 
> StreamRecordTest#testWithTimestamp as well as 
> StreamRecordTest#testWithNoTimestamp fails on JDK 11, 17, and 21
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=f0ac5c25-1168-55a5-07ff-0e88223afed9=50bf7a25-bdc4-5e56-5478-c7b4511dde53=10341]
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=675bf62c-8558-587e-2555-dcad13acefb5=5878eed3-cc1e-5b12-1ed0-9e7139ce0992=9828]
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=d06b80b4-9e88-5d40-12a2-18072cf60528=609ecd5a-3f6e-5d0c-2239-2096b155a4d0=9833]
> {code:java}
> Expecting throwable message:
> Mar 16 01:35:07   "class 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecord cannot be cast 
> to class org.apache.flink.streaming.api.watermark.Watermark 
> (org.apache.flink.streaming.runtime.streamrecord.StreamRecord and 
> org.apache.flink.streaming.api.watermark.Watermark are in unnamed module of 
> loader 'app')"
> Mar 16 01:35:07 to contain:
> Mar 16 01:35:07   "cannot be cast to 
> org.apache.flink.streaming.api.watermark.Watermark"
> Mar 16 01:35:07 but did not.
> Mar 16 01:35:07 
> Mar 16 01:35:07 Throwable that failed the check:
> Mar 16 01:35:07 
> Mar 16 01:35:07 java.lang.ClassCastException: class 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecord cannot be cast 
> to class org.apache.flink.streaming.api.watermark.Watermark 
> (org.apache.flink.streaming.runtime.streamrecord.StreamRecord and 
> org.apache.flink.streaming.api.watermark.Watermark are in unnamed module of 
> loader 'app')
> Mar 16 01:35:07   at 
> org.apache.flink.streaming.runtime.streamrecord.StreamElement.asWatermark(StreamElement.java:92)
> Mar 16 01:35:07   at 
> org.assertj.core.api.ThrowableAssert.catchThrowable(ThrowableAssert.java:63)
> Mar 16 01:35:07   at 
> org.assertj.core.api.AssertionsForClassTypes.catchThrowable(AssertionsForClassTypes.java:892)
>  {code}





[jira] [Commented] (FLINK-34719) StreamRecordTest#testWithTimestamp fails on Azure

2024-03-18 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34719?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17828012#comment-17828012
 ] 

Jiabao Sun commented on FLINK-34719:


Thanks [~rskraba] for reporting this.
I'm looking into this problem.

> StreamRecordTest#testWithTimestamp fails on Azure
> -
>
> Key: FLINK-34719
> URL: https://issues.apache.org/jira/browse/FLINK-34719
> Project: Flink
>  Issue Type: Bug
>  Components: Tests
>Affects Versions: 1.20.0
>Reporter: Ryan Skraba
>Priority: Major
>  Labels: test-stability
>
> The ClassCastException *message* expected in 
> StreamRecordTest#testWithTimestamp as well as 
> StreamRecordTest#testWithNoTimestamp fails on JDK 11, 17, and 21
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=f0ac5c25-1168-55a5-07ff-0e88223afed9=50bf7a25-bdc4-5e56-5478-c7b4511dde53=10341]
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=675bf62c-8558-587e-2555-dcad13acefb5=5878eed3-cc1e-5b12-1ed0-9e7139ce0992=9828]
>  * 
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=58352=logs=d06b80b4-9e88-5d40-12a2-18072cf60528=609ecd5a-3f6e-5d0c-2239-2096b155a4d0=9833]
> {code:java}
> Expecting throwable message:
> Mar 16 01:35:07   "class 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecord cannot be cast 
> to class org.apache.flink.streaming.api.watermark.Watermark 
> (org.apache.flink.streaming.runtime.streamrecord.StreamRecord and 
> org.apache.flink.streaming.api.watermark.Watermark are in unnamed module of 
> loader 'app')"
> Mar 16 01:35:07 to contain:
> Mar 16 01:35:07   "cannot be cast to 
> org.apache.flink.streaming.api.watermark.Watermark"
> Mar 16 01:35:07 but did not.
> Mar 16 01:35:07 
> Mar 16 01:35:07 Throwable that failed the check:
> Mar 16 01:35:07 
> Mar 16 01:35:07 java.lang.ClassCastException: class 
> org.apache.flink.streaming.runtime.streamrecord.StreamRecord cannot be cast 
> to class org.apache.flink.streaming.api.watermark.Watermark 
> (org.apache.flink.streaming.runtime.streamrecord.StreamRecord and 
> org.apache.flink.streaming.api.watermark.Watermark are in unnamed module of 
> loader 'app')
> Mar 16 01:35:07   at 
> org.apache.flink.streaming.runtime.streamrecord.StreamElement.asWatermark(StreamElement.java:92)
> Mar 16 01:35:07   at 
> org.assertj.core.api.ThrowableAssert.catchThrowable(ThrowableAssert.java:63)
> Mar 16 01:35:07   at 
> org.assertj.core.api.AssertionsForClassTypes.catchThrowable(AssertionsForClassTypes.java:892)
>  {code}





[jira] [Commented] (FLINK-34585) [JUnit5 Migration] Module: Flink CDC

2024-03-15 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17827462#comment-17827462
 ] 

Jiabao Sun commented on FLINK-34585:


Thanks [~kunni] for volunteering.
Assigned to you.

> [JUnit5 Migration] Module: Flink CDC
> 
>
> Key: FLINK-34585
> URL: https://issues.apache.org/jira/browse/FLINK-34585
> Project: Flink
>  Issue Type: Sub-task
>  Components: Flink CDC
>Reporter: Hang Ruan
>Assignee: LvYanquan
>Priority: Major
>
> Most tests in Flink CDC are still using JUnit 4. We need to use JUnit 5 
> instead.





[jira] [Assigned] (FLINK-34585) [JUnit5 Migration] Module: Flink CDC

2024-03-15 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34585?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-34585:
--

Assignee: LvYanquan

> [JUnit5 Migration] Module: Flink CDC
> 
>
> Key: FLINK-34585
> URL: https://issues.apache.org/jira/browse/FLINK-34585
> Project: Flink
>  Issue Type: Sub-task
>  Components: Flink CDC
>Reporter: Hang Ruan
>Assignee: LvYanquan
>Priority: Major
>
> Most tests in Flink CDC are still using JUnit 4. We need to use JUnit 5 
> instead.





[jira] [Commented] (FLINK-25544) [JUnit5 Migration] Module: flink-streaming-java

2024-03-15 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-25544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17827460#comment-17827460
 ] 

Jiabao Sun commented on FLINK-25544:


master: 62f44e0118539c1ed0dedf47099326f97c9d0427

> [JUnit5 Migration] Module: flink-streaming-java
> ---
>
> Key: FLINK-25544
> URL: https://issues.apache.org/jira/browse/FLINK-25544
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Hang Ruan
>Assignee: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 1.20.0
>
>






[jira] [Updated] (FLINK-25544) [JUnit5 Migration] Module: flink-streaming-java

2024-03-15 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-25544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-25544:
---
Release Note:   (was: master: 62f44e0118539c1ed0dedf47099326f97c9d0427)

> [JUnit5 Migration] Module: flink-streaming-java
> ---
>
> Key: FLINK-25544
> URL: https://issues.apache.org/jira/browse/FLINK-25544
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Hang Ruan
>Assignee: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 1.20.0
>
>






[jira] [Resolved] (FLINK-25544) [JUnit5 Migration] Module: flink-streaming-java

2024-03-15 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-25544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-25544.

Fix Version/s: 1.20.0
 Release Note: master: 62f44e0118539c1ed0dedf47099326f97c9d0427
   Resolution: Fixed

> [JUnit5 Migration] Module: flink-streaming-java
> ---
>
> Key: FLINK-25544
> URL: https://issues.apache.org/jira/browse/FLINK-25544
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Hang Ruan
>Assignee: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 1.20.0
>
>






[jira] [Commented] (FLINK-25544) [JUnit5 Migration] Module: flink-streaming-java

2024-03-07 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-25544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17824476#comment-17824476
 ] 

Jiabao Sun commented on FLINK-25544:


master:
395928901cb99c019d8885d1c39839c33e5ed587
4bba35fa1f02a6a92e0db2d0e131c9a17bf17125
d1954b580020f62e5fdaff6830bccc3e569ce78d
6433aeb955a24fe0402d12bc170b4a9a58207e7e

> [JUnit5 Migration] Module: flink-streaming-java
> ---
>
> Key: FLINK-25544
> URL: https://issues.apache.org/jira/browse/FLINK-25544
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Hang Ruan
>Assignee: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
>






[jira] [Commented] (FLINK-25544) [JUnit5 Migration] Module: flink-streaming-java

2024-03-07 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-25544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17824373#comment-17824373
 ] 

Jiabao Sun commented on FLINK-25544:


master: 6f7b24817a81995e90cfc2cd77efadb41be8cddc

> [JUnit5 Migration] Module: flink-streaming-java
> ---
>
> Key: FLINK-25544
> URL: https://issues.apache.org/jira/browse/FLINK-25544
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Hang Ruan
>Assignee: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
>






[jira] [Resolved] (FLINK-34183) Add NOTICE files for Flink CDC project

2024-03-06 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34183.

Fix Version/s: cdc-3.1.0
   Resolution: Implemented

Implemented via flink-cdc master: 86272bf1029022adbf6d34132f4b34df14f2ad89

> Add NOTICE files for Flink CDC project
> --
>
> Key: FLINK-34183
> URL: https://issues.apache.org/jira/browse/FLINK-34183
> Project: Flink
>  Issue Type: Sub-task
>  Components: Flink CDC
>Reporter: Leonard Xu
>Assignee: Hang Ruan
>Priority: Major
>  Labels: pull-request-available
> Fix For: cdc-3.1.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-34577) Add IssueNavigationLink for IDEA git log

2024-03-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34577.

Fix Version/s: cdc-3.1.0
   Resolution: Implemented

Implemented via flink-cdc (master): 96888b2ce0a7981ebe5917b6c27deb4015d845d2

> Add IssueNavigationLink for IDEA git log
> 
>
> Key: FLINK-34577
> URL: https://issues.apache.org/jira/browse/FLINK-34577
> Project: Flink
>  Issue Type: Sub-task
>  Components: Flink CDC
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Major
>  Labels: pull-request-available
> Fix For: cdc-3.1.0
>
>
> Add IssueNavigationLink for IDEA git log



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-34577) Add IssueNavigationLink for IDEA git log

2024-03-05 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-34577:
--

Assignee: Zhongqiang Gong

> Add IssueNavigationLink for IDEA git log
> 
>
> Key: FLINK-34577
> URL: https://issues.apache.org/jira/browse/FLINK-34577
> Project: Flink
>  Issue Type: Sub-task
>  Components: Flink CDC
>Reporter: Zhongqiang Gong
>Assignee: Zhongqiang Gong
>Priority: Major
>  Labels: pull-request-available
>
> Add IssueNavigationLink for IDEA git log



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-25544) [JUnit5 Migration] Module: flink-streaming-java

2024-03-03 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-25544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17823018#comment-17823018
 ] 

Jiabao Sun commented on FLINK-25544:


Hi [~Thesharing]. 
I assigned this ticket to myself as it hasn't been updated for a long time. 
You are also welcome to help review the PR if you have time.

> [JUnit5 Migration] Module: flink-streaming-java
> ---
>
> Key: FLINK-25544
> URL: https://issues.apache.org/jira/browse/FLINK-25544
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Hang Ruan
>Assignee: Zhilong Hong
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-25544) [JUnit5 Migration] Module: flink-streaming-java

2024-03-03 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-25544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-25544:
--

Assignee: Jiabao Sun  (was: Zhilong Hong)

> [JUnit5 Migration] Module: flink-streaming-java
> ---
>
> Key: FLINK-25544
> URL: https://issues.apache.org/jira/browse/FLINK-25544
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Hang Ruan
>Assignee: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-34492) fix scala style comment link when migrate scala to java

2024-03-01 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34492.

Resolution: Fixed

master: 46cbf22147d783fb68f77fad95161dc5ef036c96

> fix scala style comment link when migrate scala to java
> ---
>
> Key: FLINK-34492
> URL: https://issues.apache.org/jira/browse/FLINK-34492
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / Planner
>Affects Versions: 1.20.0
>Reporter: Jacky Lau
>Assignee: Jacky Lau
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.20.0
>
>
>  
> scala [[org.apache.calcite.rel.rules.CalcMergeRule]]
> java  {@link org.apache.calcite.rel.rules.CalcMergeRule}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-34492) fix scala style comment link when migrate scala to java

2024-03-01 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-34492:
--

Assignee: Jacky Lau

> fix scala style comment link when migrate scala to java
> ---
>
> Key: FLINK-34492
> URL: https://issues.apache.org/jira/browse/FLINK-34492
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table SQL / Planner
>Affects Versions: 1.20.0
>Reporter: Jacky Lau
>Assignee: Jacky Lau
>Priority: Major
>  Labels: pull-request-available
> Fix For: 1.20.0
>
>
>  
> scala [[org.apache.calcite.rel.rules.CalcMergeRule]]
> java  {@link org.apache.calcite.rel.rules.CalcMergeRule}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Reopened] (FLINK-25537) [JUnit5 Migration] Module: flink-core

2024-02-26 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-25537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reopened FLINK-25537:


I noticed that there are remaining tests in other packages.
Hi [~Aiden Gong], will you continue to finish them?

> [JUnit5 Migration] Module: flink-core
> -
>
> Key: FLINK-25537
> URL: https://issues.apache.org/jira/browse/FLINK-25537
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Core
>Reporter: Qingsheng Ren
>Assignee: Aiden Gong
>Priority: Minor
>  Labels: pull-request-available, stale-assigned
> Fix For: 1.20.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-25537) [JUnit5 Migration] Module: flink-core

2024-02-26 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-25537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-25537.

Fix Version/s: 1.20.0
   Resolution: Fixed

> [JUnit5 Migration] Module: flink-core
> -
>
> Key: FLINK-25537
> URL: https://issues.apache.org/jira/browse/FLINK-25537
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Core
>Reporter: Qingsheng Ren
>Assignee: Aiden Gong
>Priority: Minor
>  Labels: pull-request-available, stale-assigned
> Fix For: 1.20.0
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-25537) [JUnit5 Migration] Module: flink-core

2024-02-26 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-25537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17820667#comment-17820667
 ] 

Jiabao Sun commented on FLINK-25537:


master: 922cc2ad52203e4c474f3837fcc9a219dd293fa5

> [JUnit5 Migration] Module: flink-core
> -
>
> Key: FLINK-25537
> URL: https://issues.apache.org/jira/browse/FLINK-25537
> Project: Flink
>  Issue Type: Sub-task
>  Components: API / Core
>Reporter: Qingsheng Ren
>Assignee: Aiden Gong
>Priority: Minor
>  Labels: pull-request-available, stale-assigned
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-34461) MongoDB weekly builds fail with time out on Flink 1.18.1 for JDK17

2024-02-19 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34461.

Fix Version/s: mongodb-1.1.0
   Resolution: Fixed

> MongoDB weekly builds fail with time out on Flink 1.18.1 for JDK17
> --
>
> Key: FLINK-34461
> URL: https://issues.apache.org/jira/browse/FLINK-34461
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / MongoDB
>Affects Versions: mongodb-1.1.0
>Reporter: Martijn Visser
>Assignee: Jiabao Sun
>Priority: Critical
>  Labels: test-stability
> Fix For: mongodb-1.1.0
>
>
> The weekly tests for MongoDB consistently time out for the v1.0 branch while 
> testing Flink 1.18.1 for JDK17:
> https://github.com/apache/flink-connector-mongodb/actions/runs/7770329490/job/21190387348
> https://github.com/apache/flink-connector-mongodb/actions/runs/7858349600/job/21443232301
> https://github.com/apache/flink-connector-mongodb/actions/runs/7945225005/job/21691624903



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34461) MongoDB weekly builds fail with time out on Flink 1.18.1 for JDK17

2024-02-19 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17818527#comment-17818527
 ] 

Jiabao Sun commented on FLINK-34461:


The reason for this issue is that the v1.0 branch is missing the backport of 
FLINK-33899. 
It has been fixed in PR-28 via 
v1.0 (5a8b0979d79e1da009115cde7375bf28c45c22ad, 
a56c003b8c5aca646e47d4950189b81c9e7e75c3).

Since the main branch has updated its nightly builds to run against the latest 
released v1.1 branch, which already includes these two commits, the nightly CI 
will not fail.
main (aaf3867b2a72a61a0511f250c36580842623b6bc)

> MongoDB weekly builds fail with time out on Flink 1.18.1 for JDK17
> --
>
> Key: FLINK-34461
> URL: https://issues.apache.org/jira/browse/FLINK-34461
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / MongoDB
>Affects Versions: mongodb-1.1.0
>Reporter: Martijn Visser
>Assignee: Jiabao Sun
>Priority: Critical
>  Labels: test-stability
>
> The weekly tests for MongoDB consistently time out for the v1.0 branch while 
> testing Flink 1.18.1 for JDK17:
> https://github.com/apache/flink-connector-mongodb/actions/runs/7770329490/job/21190387348
> https://github.com/apache/flink-connector-mongodb/actions/runs/7858349600/job/21443232301
> https://github.com/apache/flink-connector-mongodb/actions/runs/7945225005/job/21691624903



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Assigned] (FLINK-34461) MongoDB weekly builds fail with time out on Flink 1.18.1 for JDK17

2024-02-19 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun reassigned FLINK-34461:
--

Assignee: Jiabao Sun

> MongoDB weekly builds fail with time out on Flink 1.18.1 for JDK17
> --
>
> Key: FLINK-34461
> URL: https://issues.apache.org/jira/browse/FLINK-34461
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / MongoDB
>Affects Versions: mongodb-1.1.0
>Reporter: Martijn Visser
>Assignee: Jiabao Sun
>Priority: Critical
>  Labels: test-stability
>
> The weekly tests for MongoDB consistently time out for the v1.0 branch while 
> testing Flink 1.18.1 for JDK17:
> https://github.com/apache/flink-connector-mongodb/actions/runs/7770329490/job/21190387348
> https://github.com/apache/flink-connector-mongodb/actions/runs/7858349600/job/21443232301
> https://github.com/apache/flink-connector-mongodb/actions/runs/7945225005/job/21691624903



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34214) FLIP-377: Support fine-grained configuration to control filter push down for Table/SQL Sources

2024-02-11 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34214?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17816377#comment-17816377
 ] 

Jiabao Sun commented on FLINK-34214:


Hi [~stayrascal]. 
When filters are not pushed down to the database, we can iterate through the 
data in batches using a primary-key index or natural order and filter it with 
external computing resources. This greatly reduces the computational overhead 
on the database for filters that do not hit any index.
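To make the scenario above concrete, here is a hypothetical Flink SQL table definition showing how the `filter.handling.policy` option from FLIP-377 could be set on a JDBC source. The table name, URL, and the specific option values (`always`/`never`) are illustrative assumptions for this sketch, not a definitive reference:

```sql
-- Hypothetical example: keep all filters in Flink instead of pushing them
-- down to the database, so the source scans in natural/index order and the
-- filtering cost is paid by Flink's own compute resources.
CREATE TABLE orders (
    id BIGINT,
    price DECIMAL(10, 2),
    order_time TIMESTAMP(3)
) WITH (
    'connector' = 'jdbc',
    'url' = 'jdbc:mysql://localhost:3306/mydb',       -- assumed URL
    'table-name' = 'orders',
    'filter.handling.policy' = 'never'                -- assumed value; pushing
                                                      -- down would be 'always'
);
```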

> FLIP-377: Support fine-grained configuration to control filter push down for 
> Table/SQL Sources
> --
>
> Key: FLINK-34214
> URL: https://issues.apache.org/jira/browse/FLINK-34214
> Project: Flink
>  Issue Type: New Feature
>  Components: Connectors / JDBC, Connectors / MongoDB
>Affects Versions: mongodb-1.0.2, jdbc-3.1.2
>Reporter: jiabao.sun
>Assignee: jiabao.sun
>Priority: Major
> Fix For: jdbc-3.1.3, mongodb-1.2.0
>
>
> This improvement implements [FLIP-377 Support fine-grained configuration to 
> control filter push down for Table/SQL 
> Sources|https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=276105768]
> This FLIP has 2 goals:
>  * Introduces a new configuration filter.handling.policy to the JDBC and 
> MongoDB connector.
>  * Suggests a convention option name if other connectors are going to add an 
> option for the same purpose.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-34337) Sink.InitContextWrapper should implement metadataConsumer method

2024-02-01 Thread Jiabao Sun (Jira)
Jiabao Sun created FLINK-34337:
--

 Summary: Sink.InitContextWrapper should implement metadataConsumer 
method
 Key: FLINK-34337
 URL: https://issues.apache.org/jira/browse/FLINK-34337
 Project: Flink
  Issue Type: Bug
  Components: API / Core
Affects Versions: 1.19.0
Reporter: Jiabao Sun
 Fix For: 1.19.0


Sink.InitContextWrapper should implement metadataConsumer method.

If the metadataConsumer method is not implemented, the behavior of the wrapped 
WriterInitContext's metadataConsumer will be lost.
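The delegation bug described above can be sketched with simplified stand-ins for the real Flink interfaces. The type and method names below only mirror the ones mentioned in the ticket; the actual Flink signatures differ, so treat this as an illustration of the missing-override problem, not the real fix:

```java
import java.util.Optional;
import java.util.function.Consumer;

// Hypothetical stand-in: if a wrapper does not override metadataConsumer(),
// it falls back to the interface default and silently drops the wrapped
// context's consumer -- the behavior loss described in the ticket.
interface WriterInitContext {
    default Optional<Consumer<Object>> metadataConsumer() {
        return Optional.empty();
    }
}

class InitContextWrapper implements WriterInitContext {
    private final WriterInitContext wrapped;

    InitContextWrapper(WriterInitContext wrapped) {
        this.wrapped = wrapped;
    }

    // The fix: forward the call so the wrapped context's consumer survives.
    @Override
    public Optional<Consumer<Object>> metadataConsumer() {
        return wrapped.metadataConsumer();
    }
}

public class MetadataConsumerDemo {
    public static void main(String[] args) {
        WriterInitContext inner = new WriterInitContext() {
            @Override
            public Optional<Consumer<Object>> metadataConsumer() {
                return Optional.of(meta -> System.out.println("got " + meta));
            }
        };
        // With the override in place, the inner consumer is preserved.
        System.out.println(
                new InitContextWrapper(inner).metadataConsumer().isPresent());
    }
}
```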



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34259) flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled

2024-01-30 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34259?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17812312#comment-17812312
 ] 

Jiabao Sun commented on FLINK-34259:


The PR was reopened, PTAL.

> flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled
> -
>
> Key: FLINK-34259
> URL: https://issues.apache.org/jira/browse/FLINK-34259
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / JDBC
>Reporter: Martijn Visser
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 1.19.0
>
>
> https://github.com/apache/flink-connector-jdbc/actions/runs/7682035724/job/20935884874#step:14:150
> {code:java}
> Error:  Tests run: 10, Failures: 5, Errors: 4, Skipped: 0, Time elapsed: 
> 7.909 s <<< FAILURE! - in 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest
> Error:  
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat
>   Time elapsed: 3.254 s  <<< ERROR!
> java.lang.NullPointerException: Cannot invoke 
> "org.apache.flink.api.common.serialization.SerializerConfig.hasGenericTypesDisabled()"
>  because "config" is null
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:85)
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:99)
>   at 
> org.apache.flink.connector.jdbc.JdbcTestBase.getSerializer(JdbcTestBase.java:70)
>   at 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat(JdbcRowOutputFormatTest.java:336)
>   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
> {code}
> Seems to be caused by FLINK-34122 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34259) flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled

2024-01-30 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34259?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17812299#comment-17812299
 ] 

Jiabao Sun commented on FLINK-34259:


[~martijnvisser] But I still have a question. The earlier changes in 
[FLINK-34090] did not change the compatibility of the public interface. 
Normally, when an ExecutionConfig object is created through its constructor, a 
SerializerConfig object is created as well, so hasGenericTypesDisabled should 
not throw an NPE. The NPE in the JDBC connector tests occurs mainly because 
the ExecutionConfig is mocked with Mockito, so 
serializerConfig.hasGenericTypesDisabled() throws an NPE. I'm not sure whether 
this qualifies as breaking the public interface.
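The mechanism described above can be sketched without Mockito: an unstubbed getter on a mock returns null (Mockito's default for reference types), and the nested dereference then throws. The class and method names below are simplified stand-ins for the real ExecutionConfig/SerializerConfig, so this is an assumption-laden illustration, not Flink's actual code:

```java
// Simplified stand-ins for ExecutionConfig / SerializerConfig, to show why a
// mock whose getSerializerConfig() is left unstubbed (and so returns null,
// as Mockito does by default) produces the NPE seen in the JDBC tests.
class SerializerConfig {
    boolean hasGenericTypesDisabled() {
        return false;
    }
}

class ExecutionConfig {
    // A real ExecutionConfig constructs its SerializerConfig eagerly.
    SerializerConfig getSerializerConfig() {
        return new SerializerConfig();
    }
}

public class MockNpeDemo {
    // Mirrors the failing call chain: dereferences the nested config object.
    static boolean createSerializer(ExecutionConfig config) {
        return config.getSerializerConfig().hasGenericTypesDisabled();
    }

    public static void main(String[] args) {
        // A hand-rolled "mock": the unstubbed getter returns null.
        ExecutionConfig mocked = new ExecutionConfig() {
            @Override
            SerializerConfig getSerializerConfig() {
                return null;
            }
        };
        try {
            createSerializer(mocked);
        } catch (NullPointerException e) {
            System.out.println("NPE, as in the JDBC connector test");
        }
        // A properly constructed config does not throw:
        System.out.println(createSerializer(new ExecutionConfig()));
    }
}
```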

> flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled
> -
>
> Key: FLINK-34259
> URL: https://issues.apache.org/jira/browse/FLINK-34259
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / JDBC
>Reporter: Martijn Visser
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 1.19.0
>
>
> https://github.com/apache/flink-connector-jdbc/actions/runs/7682035724/job/20935884874#step:14:150
> {code:java}
> Error:  Tests run: 10, Failures: 5, Errors: 4, Skipped: 0, Time elapsed: 
> 7.909 s <<< FAILURE! - in 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest
> Error:  
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat
>   Time elapsed: 3.254 s  <<< ERROR!
> java.lang.NullPointerException: Cannot invoke 
> "org.apache.flink.api.common.serialization.SerializerConfig.hasGenericTypesDisabled()"
>  because "config" is null
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:85)
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:99)
>   at 
> org.apache.flink.connector.jdbc.JdbcTestBase.getSerializer(JdbcTestBase.java:70)
>   at 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat(JdbcRowOutputFormatTest.java:336)
>   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
> {code}
> Seems to be caused by FLINK-34122 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34259) flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled

2024-01-30 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34259?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17812293#comment-17812293
 ] 

Jiabao Sun commented on FLINK-34259:


Thanks [~martijnvisser] , I will close the PR for the JDBC connector.

> flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled
> -
>
> Key: FLINK-34259
> URL: https://issues.apache.org/jira/browse/FLINK-34259
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / JDBC
>Reporter: Martijn Visser
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 1.19.0
>
>
> https://github.com/apache/flink-connector-jdbc/actions/runs/7682035724/job/20935884874#step:14:150
> {code:java}
> Error:  Tests run: 10, Failures: 5, Errors: 4, Skipped: 0, Time elapsed: 
> 7.909 s <<< FAILURE! - in 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest
> Error:  
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat
>   Time elapsed: 3.254 s  <<< ERROR!
> java.lang.NullPointerException: Cannot invoke 
> "org.apache.flink.api.common.serialization.SerializerConfig.hasGenericTypesDisabled()"
>  because "config" is null
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:85)
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:99)
>   at 
> org.apache.flink.connector.jdbc.JdbcTestBase.getSerializer(JdbcTestBase.java:70)
>   at 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat(JdbcRowOutputFormatTest.java:336)
>   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
> {code}
> Seems to be caused by FLINK-34122 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Comment Edited] (FLINK-34259) flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled

2024-01-30 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34259?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17812291#comment-17812291
 ] 

Jiabao Sun edited comment on FLINK-34259 at 1/30/24 12:45 PM:
--

[~martijnvisser]
It seems introduced by [FLINK-34090]
It doesn't seem to break public interfaces, and I think we only need to make 
some adjustments to the tests of the JDBC connector.


was (Author: jiabao.sun):
[~martijnvisser]
It seems introduced by 
[FLINK-34090](https://issues.apache.org/jira/browse/FLINK-34122) 
It doesn't seem to break public interfaces, and I think we only need to make 
some adjustments in the testing of the JDBC connector.

> flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled
> -
>
> Key: FLINK-34259
> URL: https://issues.apache.org/jira/browse/FLINK-34259
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / JDBC
>Reporter: Martijn Visser
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 1.19.0
>
>
> https://github.com/apache/flink-connector-jdbc/actions/runs/7682035724/job/20935884874#step:14:150
> {code:java}
> Error:  Tests run: 10, Failures: 5, Errors: 4, Skipped: 0, Time elapsed: 
> 7.909 s <<< FAILURE! - in 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest
> Error:  
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat
>   Time elapsed: 3.254 s  <<< ERROR!
> java.lang.NullPointerException: Cannot invoke 
> "org.apache.flink.api.common.serialization.SerializerConfig.hasGenericTypesDisabled()"
>  because "config" is null
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:85)
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:99)
>   at 
> org.apache.flink.connector.jdbc.JdbcTestBase.getSerializer(JdbcTestBase.java:70)
>   at 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat(JdbcRowOutputFormatTest.java:336)
>   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
> {code}
> Seems to be caused by FLINK-34122 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34259) flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled

2024-01-30 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34259?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17812291#comment-17812291
 ] 

Jiabao Sun commented on FLINK-34259:


[~martijnvisser]
It seems introduced by 
[FLINK-34090](https://issues.apache.org/jira/browse/FLINK-34122) 
It doesn't seem to break public interfaces, and I think we only need to make 
some adjustments in the testing of the JDBC connector.

> flink-connector-jdbc fails to compile with NPE on hasGenericTypesDisabled
> -
>
> Key: FLINK-34259
> URL: https://issues.apache.org/jira/browse/FLINK-34259
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / JDBC
>Reporter: Martijn Visser
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: 1.19.0
>
>
> https://github.com/apache/flink-connector-jdbc/actions/runs/7682035724/job/20935884874#step:14:150
> {code:java}
> Error:  Tests run: 10, Failures: 5, Errors: 4, Skipped: 0, Time elapsed: 
> 7.909 s <<< FAILURE! - in 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest
> Error:  
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat
>   Time elapsed: 3.254 s  <<< ERROR!
> java.lang.NullPointerException: Cannot invoke 
> "org.apache.flink.api.common.serialization.SerializerConfig.hasGenericTypesDisabled()"
>  because "config" is null
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:85)
>   at 
> org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:99)
>   at 
> org.apache.flink.connector.jdbc.JdbcTestBase.getSerializer(JdbcTestBase.java:70)
>   at 
> org.apache.flink.connector.jdbc.JdbcRowOutputFormatTest.testInvalidConnectionInJdbcOutputFormat(JdbcRowOutputFormatTest.java:336)
>   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
>   at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
> {code}
> Seems to be caused by FLINK-34122 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34260) Update flink-connector-aws to be compatible with updated SinkV2 interfaces

2024-01-29 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34260?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17811941#comment-17811941
 ] 

Jiabao Sun commented on FLINK-34260:


Sorry [~dannycranmer], I just noticed this comment.
I made some attempts in this 
[PR-126|https://github.com/apache/flink-connector-aws/pull/126] , and it should 
work. 
However, since Aleksandr is also working on it, I closed it.
This change may not be the best solution, but I hope it can be helpful.

> Update flink-connector-aws to be compatible with updated SinkV2 interfaces
> --
>
> Key: FLINK-34260
> URL: https://issues.apache.org/jira/browse/FLINK-34260
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / AWS
>Affects Versions: aws-connector-4.3.0
>Reporter: Martijn Visser
>Assignee: Aleksandr Pilipenko
>Priority: Blocker
>  Labels: pull-request-available
>
> https://github.com/apache/flink-connector-aws/actions/runs/7689300085/job/20951547366#step:9:798
> {code:java}
> Error:  Failed to execute goal 
> org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile 
> (default-testCompile) on project flink-connector-dynamodb: Compilation failure
> Error:  
> /home/runner/work/flink-connector-aws/flink-connector-aws/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/DynamoDbSinkWriterTest.java:[357,40]
>  incompatible types: 
> org.apache.flink.connector.base.sink.writer.TestSinkInitContext cannot be 
> converted to org.apache.flink.api.connector.sink2.Sink.InitContext
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34260) Update flink-connector-aws to be compatible with updated SinkV2 interfaces

2024-01-29 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34260?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17811807#comment-17811807
 ] 

Jiabao Sun commented on FLINK-34260:


Hi [~martijnvisser], can I take this ticket?

> Update flink-connector-aws to be compatible with updated SinkV2 interfaces
> --
>
> Key: FLINK-34260
> URL: https://issues.apache.org/jira/browse/FLINK-34260
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / AWS
>Affects Versions: aws-connector-4.3.0
>Reporter: Martijn Visser
>Priority: Blocker
>
> https://github.com/apache/flink-connector-aws/actions/runs/7689300085/job/20951547366#step:9:798
> {code:java}
> Error:  Failed to execute goal 
> org.apache.maven.plugins:maven-compiler-plugin:3.8.0:testCompile 
> (default-testCompile) on project flink-connector-dynamodb: Compilation failure
> Error:  
> /home/runner/work/flink-connector-aws/flink-connector-aws/flink-connector-aws/flink-connector-dynamodb/src/test/java/org/apache/flink/connector/dynamodb/sink/DynamoDbSinkWriterTest.java:[357,40]
>  incompatible types: 
> org.apache.flink.connector.base.sink.writer.TestSinkInitContext cannot be 
> converted to org.apache.flink.api.connector.sink2.Sink.InitContext
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Comment Edited] (FLINK-34087) [JUnit5 Migration] Module: flink-dist

2024-01-26 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34087?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17811158#comment-17811158
 ] 

Jiabao Sun edited comment on FLINK-34087 at 1/26/24 8:10 AM:
-

Resolved via bcd448b2f1efecc701079c1a0f7f565a59817f22


was (Author: jiabao.sun):
Resolved by FLIK-33577

> [JUnit5 Migration] Module: flink-dist
> -
>
> Key: FLINK-34087
> URL: https://issues.apache.org/jira/browse/FLINK-34087
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Resolved] (FLINK-34087) [JUnit5 Migration] Module: flink-dist

2024-01-26 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34087?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun resolved FLINK-34087.

Resolution: Duplicate

Resolved by FLINK-33577

> [JUnit5 Migration] Module: flink-dist
> -
>
> Key: FLINK-34087
> URL: https://issues.apache.org/jira/browse/FLINK-34087
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Jiabao Sun
>Priority: Minor
>  Labels: pull-request-available
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34113) Update flink-connector-elasticsearch to be compatible with updated SinkV2 interfaces

2024-01-25 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17811151#comment-17811151
 ] 

Jiabao Sun commented on FLINK-34113:


[~martijnvisser] We can use TestSinkInitContext instead of MockInitContext to 
resolve this compilation problem.
This would keep the Elasticsearch connector compatible with Flink 1.18 as 
well.

> Update flink-connector-elasticsearch to be compatible with updated SinkV2 
> interfaces
> 
>
> Key: FLINK-34113
> URL: https://issues.apache.org/jira/browse/FLINK-34113
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Connectors / ElasticSearch
>Reporter: Martijn Visser
>Priority: Blocker
>  Labels: pull-request-available
> Fix For: elasticsearch-3.2.0
>
>
> Make sure that the connector is updated to deal with the new changes 
> introduced in FLINK-33973
> See also 
> https://github.com/apache/flink-connector-elasticsearch/actions/runs/7539688654/job/20522689108#step:14:159
>  for details on the current failure
> This means that the new Elasticsearch connector will be compatible only with 
> Flink 1.19, with (the upcoming) v3.1.0 being compatible with only Flink 1.18



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-34216) Support fine-grained configuration to control filter push down for MongoDB Connector

2024-01-23 Thread Jiabao Sun (Jira)
Jiabao Sun created FLINK-34216:
--

 Summary: Support fine-grained configuration to control filter push 
down for MongoDB Connector
 Key: FLINK-34216
 URL: https://issues.apache.org/jira/browse/FLINK-34216
 Project: Flink
  Issue Type: Sub-task
  Components: Connectors / MongoDB
Affects Versions: mongodb-1.0.2
Reporter: Jiabao Sun
 Fix For: mongodb-1.1.0


Support fine-grained configuration to control filter push down for MongoDB 
Connector.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-34215) Support fine-grained configuration to control filter push down for JDBC Connector

2024-01-23 Thread Jiabao Sun (Jira)
Jiabao Sun created FLINK-34215:
--

 Summary: Support fine-grained configuration to control filter push 
down for JDBC Connector
 Key: FLINK-34215
 URL: https://issues.apache.org/jira/browse/FLINK-34215
 Project: Flink
  Issue Type: Sub-task
  Components: Connectors / JDBC
Affects Versions: jdbc-3.1.2
Reporter: Jiabao Sun
 Fix For: jdbc-3.1.3


Support fine-grained configuration to control filter push down for JDBC 
Connector.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-34214) FLIP-377: Support fine-grained configuration to control filter push down for Table/SQL Sources

2024-01-23 Thread Jiabao Sun (Jira)
Jiabao Sun created FLINK-34214:
--

 Summary: FLIP-377: Support fine-grained configuration to control 
filter push down for Table/SQL Sources
 Key: FLINK-34214
 URL: https://issues.apache.org/jira/browse/FLINK-34214
 Project: Flink
  Issue Type: New Feature
  Components: Connectors / JDBC, Connectors / MongoDB
Affects Versions: mongodb-1.0.2, jdbc-3.1.2
Reporter: Jiabao Sun
 Fix For: mongodb-1.1.0, jdbc-3.1.3


This improvement implements [FLIP-377 Support fine-grained configuration to 
control filter push down for Table/SQL 
Sources|https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=276105768]

This FLIP has 2 goals:
 * Introduces a new configuration option filter.handling.policy to the JDBC and 
MongoDB connectors.
 * Suggests a conventional option name for other connectors that want to add an 
option for the same purpose.
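
A hypothetical usage sketch of the suggested option in DDL (the connector 
options and the policy value shown are illustrative assumptions, not taken 
from the FLIP text):

{code:sql}
-- 'filter.handling.policy' controls whether the source accepts filter
-- push down from the planner (value name is illustrative).
CREATE TABLE orders (
  id BIGINT,
  amount DOUBLE
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:mysql://localhost:3306/mydb',
  'table-name' = 'orders',
  'filter.handling.policy' = 'never'
);
{code}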



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-31724) SqlClientITCase.testMatchRecognize fails with "bash -c rm -rf /opt/flink/checkpoint/*" returned non-zero exit code 1

2024-01-22 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-31724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17809489#comment-17809489
 ] 

Jiabao Sun commented on FLINK-31724:


https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=56285=logs=af184cdd-c6d8-5084-0b69-7e9c67b35f7a=0f3adb59-eefa-51c6-2858-3654d9e0749d=15295

> SqlClientITCase.testMatchRecognize fails with "bash -c rm -rf 
> /opt/flink/checkpoint/*" returned non-zero exit code 1
> 
>
> Key: FLINK-31724
> URL: https://issues.apache.org/jira/browse/FLINK-31724
> Project: Flink
>  Issue Type: Bug
>  Components: Table SQL / Client
>Affects Versions: 1.18.0, 1.19.0
>Reporter: Sergey Nuyanzin
>Assignee: Qingsheng Ren
>Priority: Major
>  Labels: stale-assigned, test-stability
>
> [https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=47893=logs=af184cdd-c6d8-5084-0b69-7e9c67b35f7a=160c9ae5-96fd-516e-1c91-deb81f59292a=12715]
> {noformat}
> 2023-04-04T08:11:47.8601739Z Apr 04 08:11:47 [ERROR] Tests run: 3, Failures: 
> 0, Errors: 2, Skipped: 0, Time elapsed: 205.218 s <<< FAILURE! - in 
> SqlClientITCase
> 2023-04-04T08:11:47.8736401Z Apr 04 08:11:47 [ERROR] 
> SqlClientITCase.testMatchRecognize  Time elapsed: 42.257 s  <<< ERROR!
> 2023-04-04T08:11:47.8736940Z Apr 04 08:11:47 java.lang.IllegalStateException: 
> 2023-04-04T08:11:47.8737556Z Apr 04 08:11:47 Command "bash -c rm -rf 
> /opt/flink/checkpoint/*" returned non-zero exit code 1. 
> 2023-04-04T08:11:47.8737861Z Apr 04 08:11:47 STDOUT: 
> 2023-04-04T08:11:47.8738297Z Apr 04 08:11:47 STDERR: rm: cannot remove 
> '/opt/flink/checkpoint/e2b7cbfc940e5f066e587037f80e74af': Directory not empty
> 2023-04-04T08:11:47.8738611Z Apr 04 08:11:47 
> 2023-04-04T08:11:47.8738971Z Apr 04 08:11:47  at 
> org.apache.flink.connector.testframe.container.FlinkContainers.deleteJobManagerTemporaryFiles(FlinkContainers.java:471)
> 2023-04-04T08:11:47.8740127Z Apr 04 08:11:47  at 
> org.apache.flink.connector.testframe.container.FlinkContainers.stop(FlinkContainers.java:241)
> 2023-04-04T08:11:47.8740803Z Apr 04 08:11:47  at 
> SqlClientITCase.tearDown(SqlClientITCase.java:114)
> 2023-04-04T08:11:47.8741144Z Apr 04 08:11:47  at 
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2023-04-04T08:11:47.8741677Z Apr 04 08:11:47  at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2023-04-04T08:11:47.8742090Z Apr 04 08:11:47  at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2023-04-04T08:11:47.8742463Z Apr 04 08:11:47  at 
> java.lang.reflect.Method.invoke(Method.java:498)
> 2023-04-04T08:11:47.8742825Z Apr 04 08:11:47  at 
> org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:727)
> 2023-04-04T08:11:47.8743253Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
> 2023-04-04T08:11:47.8743709Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
> 2023-04-04T08:11:47.873Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:156)
> 2023-04-04T08:11:47.8744880Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.extension.TimeoutExtension.interceptLifecycleMethod(TimeoutExtension.java:128)
> 2023-04-04T08:11:47.8745318Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.extension.TimeoutExtension.interceptAfterEachMethod(TimeoutExtension.java:110)
> 2023-04-04T08:11:47.8745812Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.execution.InterceptingExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(InterceptingExecutableInvoker.java:103)
> 2023-04-04T08:11:47.8746540Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.execution.InterceptingExecutableInvoker.lambda$invoke$0(InterceptingExecutableInvoker.java:93)
> 2023-04-04T08:11:47.8747033Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
> 2023-04-04T08:11:47.8747515Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
> 2023-04-04T08:11:47.8747969Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
> 2023-04-04T08:11:47.8748418Z Apr 04 08:11:47  at 
> org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
> 2023-04-04T08:11:47.8748845Z Apr 04 08:11:47  at 
> 

[jira] [Commented] (FLINK-34156) Move Flink Calcite rules from Scala to Java

2024-01-19 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17808635#comment-17808635
 ] 

Jiabao Sun commented on FLINK-34156:


Thanks [~Sergey Nuyanzin].
If there is a need for assistance, please feel free to ping me at any time.

> Move Flink Calcite rules from Scala to Java
> ---
>
> Key: FLINK-34156
> URL: https://issues.apache.org/jira/browse/FLINK-34156
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Table SQL / Planner
>Reporter: Sergey Nuyanzin
>Assignee: Sergey Nuyanzin
>Priority: Major
> Fix For: 2.0.0
>
>
> This is an umbrella task for migration of Calcite rules from Scala to Java 
> mentioned at https://cwiki.apache.org/confluence/display/FLINK/2.0+Release
> The reason is that since 1.28.0 ( CALCITE-4787 - Move core to use Immutables 
> instead of ImmutableBeans ) Calcite started to use Immutables 
> (https://immutables.github.io/) and since 1.29.0 removed ImmutableBeans ( 
> CALCITE-4839 - Remove remnants of ImmutableBeans post 1.28 release ). All 
> rule-configuration-related API that is not based on Immutables is marked as 
> deprecated. Since Immutables relies on code generation during Java 
> compilation, it seems impossible to use it for rules written in Scala.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34156) Move Flink Calcite rules from Scala to Java

2024-01-18 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17808425#comment-17808425
 ] 

Jiabao Sun commented on FLINK-34156:


Hi [~Sergey Nuyanzin], can I help with this task?

> Move Flink Calcite rules from Scala to Java
> ---
>
> Key: FLINK-34156
> URL: https://issues.apache.org/jira/browse/FLINK-34156
> Project: Flink
>  Issue Type: Technical Debt
>  Components: Table SQL / Planner
>Reporter: Sergey Nuyanzin
>Assignee: Sergey Nuyanzin
>Priority: Major
> Fix For: 2.0.0
>
>
> This is an umbrella task for migration of Calcite rules from Scala to Java 
> mentioned at https://cwiki.apache.org/confluence/display/FLINK/2.0+Release
> The reason is that since 1.28.0 ( CALCITE-4787 - Move core to use Immutables 
> instead of ImmutableBeans ) Calcite started to use Immutables 
> (https://immutables.github.io/) and since 1.29.0 removed ImmutableBeans ( 
> CALCITE-4839 - Remove remnants of ImmutableBeans post 1.28 release ). All 
> rule-configuration-related API that is not based on Immutables is marked as 
> deprecated. Since Immutables relies on code generation during Java 
> compilation, it seems impossible to use it for rules written in Scala.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-33717) Cleanup the usage of deprecated StreamTableEnvironment#fromDataStream

2024-01-17 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-33717?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17808053#comment-17808053
 ] 

Jiabao Sun commented on FLINK-33717:


Hi [~jackylau], are you still working on this?
Could I help finish this task?

> Cleanup the usage of deprecated StreamTableEnvironment#fromDataStream
> -
>
> Key: FLINK-33717
> URL: https://issues.apache.org/jira/browse/FLINK-33717
> Project: Flink
>  Issue Type: Sub-task
>Reporter: Jane Chan
>Assignee: Jacky Lau
>Priority: Major
>
> {code:java}
> PythonScalarFunctionOperatorTestBase
> AvroTypesITCase {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-15 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-34076:
---
Attachment: screenshot-4.png

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png, screenshot-2.png, screenshot-3.png, 
> screenshot-4.png
>
>
> The following issue occurs with flink-kinesis-connector v4.2.0 and Flink 
> 1.17; it works properly with kinesis connector v4.1.0 (I have not tested 
> versions before v4.1.0).
> The 
> [commit|https://github.com/apache/flink-connector-aws/commit/01f112bd5a69f95cd5d2a4bc7e08d1ba9a81d56a]
>  which stops bundling `flink-connector-base` with `flink-connector-kinesis` 
> causes the kinesis sink to fail to create when using the Table API, as 
> required classes from `flink-connector-base` are not loaded at runtime.
> E.g. with only the following dependency in pom.xml:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> {code}
> and a minimal job definition:
> {code:java}
>   public static void main(String[] args) throws Exception {
>   // create data stream environment
>   StreamExecutionEnvironment sEnv = 
> StreamExecutionEnvironment.getExecutionEnvironment();
>   sEnv.setRuntimeMode(RuntimeExecutionMode.STREAMING);
>   StreamTableEnvironment tEnv = 
> StreamTableEnvironment.create(sEnv);
>   Schema a = Schema.newBuilder().column("a", 
> DataTypes.STRING()).build();
>   TableDescriptor descriptor =
>   TableDescriptor.forConnector("kinesis")
>   .schema(a)
>   .format("json")
>   .build();
>   tEnv.createTemporaryTable("sinkTable", descriptor);
>   tEnv.executeSql("CREATE TABLE sinkTable " + 
> descriptor.toString()).print();
>   }
> {code}
> following exception will be thrown:
> {code:java}
> Caused by: java.lang.ClassNotFoundException: 
> org.apache.flink.connector.base.table.AsyncDynamicTableSinkFactory
>   at 
> jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581) 
> ~[?:?]
>   at 
> jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>  ~[?:?]
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:527) ~[?:?]
>   ... 28 more
> {code}
> The fix is to explicitly specify `flink-connector-base` as a dependency of 
> the project:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-base</artifactId>
>     <version>${flink.version}</version>
>     <scope>provided</scope>
> </dependency>
> {code}
> In general, `flink-connector-base` should be pulled in by default when 
> pulling in the kinesis connector, the current separation adds unnecessary 
> hassle to use the connector.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-15 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17806783#comment-17806783
 ] 

Jiabao Sun commented on FLINK-34076:


Hey [~dannycranmer], it's in dependencies.

 !screenshot-4.png! 

 

 

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png, screenshot-2.png, screenshot-3.png, 
> screenshot-4.png
>
>
> The following issue occurs with flink-kinesis-connector v4.2.0 and Flink 
> 1.17; it works properly with kinesis connector v4.1.0 (I have not tested 
> versions before v4.1.0).
> The 
> [commit|https://github.com/apache/flink-connector-aws/commit/01f112bd5a69f95cd5d2a4bc7e08d1ba9a81d56a]
>  which stops bundling `flink-connector-base` with `flink-connector-kinesis` 
> causes the kinesis sink to fail to create when using the Table API, as 
> required classes from `flink-connector-base` are not loaded at runtime.
> E.g. with only the following dependency in pom.xml:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> {code}
> and a minimal job definition:
> {code:java}
>   public static void main(String[] args) throws Exception {
>   // create data stream environment
>   StreamExecutionEnvironment sEnv = 
> StreamExecutionEnvironment.getExecutionEnvironment();
>   sEnv.setRuntimeMode(RuntimeExecutionMode.STREAMING);
>   StreamTableEnvironment tEnv = 
> StreamTableEnvironment.create(sEnv);
>   Schema a = Schema.newBuilder().column("a", 
> DataTypes.STRING()).build();
>   TableDescriptor descriptor =
>   TableDescriptor.forConnector("kinesis")
>   .schema(a)
>   .format("json")
>   .build();
>   tEnv.createTemporaryTable("sinkTable", descriptor);
>   tEnv.executeSql("CREATE TABLE sinkTable " + 
> descriptor.toString()).print();
>   }
> {code}
> following exception will be thrown:
> {code:java}
> Caused by: java.lang.ClassNotFoundException: 
> org.apache.flink.connector.base.table.AsyncDynamicTableSinkFactory
>   at 
> jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581) 
> ~[?:?]
>   at 
> jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>  ~[?:?]
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:527) ~[?:?]
>   ... 28 more
> {code}
> The fix is to explicitly specify `flink-connector-base` as a dependency of 
> the project:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-base</artifactId>
>     <version>${flink.version}</version>
>     <scope>provided</scope>
> </dependency>
> {code}
> In general, `flink-connector-base` should be pulled in by default when 
> pulling in the kinesis connector, the current separation adds unnecessary 
> hassle to use the connector.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-15 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17806747#comment-17806747
 ] 

Jiabao Sun commented on FLINK-34076:


flink-connector-base is already declared as a dependency in the parent pom, and 
the submodules inherit it.

[https://github.com/apache/flink-connector-aws/blob/38aafb44d3a8200e4ff41d87e0780338f40da258/pom.xml#L141-L146]

[https://repo1.maven.org/maven2/org/apache/flink/flink-connector-kinesis/4.2.0-1.18/flink-connector-kinesis-4.2.0-1.18.pom]

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png, screenshot-2.png, screenshot-3.png
>
>
> The following issue occurs with flink-kinesis-connector v4.2.0 and Flink 
> 1.17; it works properly with kinesis connector v4.1.0 (I have not tested 
> versions before v4.1.0).
> The 
> [commit|https://github.com/apache/flink-connector-aws/commit/01f112bd5a69f95cd5d2a4bc7e08d1ba9a81d56a]
>  which stops bundling `flink-connector-base` with `flink-connector-kinesis` 
> causes the kinesis sink to fail to create when using the Table API, as 
> required classes from `flink-connector-base` are not loaded at runtime.
> E.g. with only the following dependency in pom.xml:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> {code}
> and a minimal job definition:
> {code:java}
>   public static void main(String[] args) throws Exception {
>   // create data stream environment
>   StreamExecutionEnvironment sEnv = 
> StreamExecutionEnvironment.getExecutionEnvironment();
>   sEnv.setRuntimeMode(RuntimeExecutionMode.STREAMING);
>   StreamTableEnvironment tEnv = 
> StreamTableEnvironment.create(sEnv);
>   Schema a = Schema.newBuilder().column("a", 
> DataTypes.STRING()).build();
>   TableDescriptor descriptor =
>   TableDescriptor.forConnector("kinesis")
>   .schema(a)
>   .format("json")
>   .build();
>   tEnv.createTemporaryTable("sinkTable", descriptor);
>   tEnv.executeSql("CREATE TABLE sinkTable " + 
> descriptor.toString()).print();
>   }
> {code}
> following exception will be thrown:
> {code:java}
> Caused by: java.lang.ClassNotFoundException: 
> org.apache.flink.connector.base.table.AsyncDynamicTableSinkFactory
>   at 
> jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581) 
> ~[?:?]
>   at 
> jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>  ~[?:?]
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:527) ~[?:?]
>   ... 28 more
> {code}
> The fix is to explicitly specify `flink-connector-base` as a dependency of 
> the project:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-base</artifactId>
>     <version>${flink.version}</version>
>     <scope>provided</scope>
> </dependency>
> {code}
> In general, `flink-connector-base` should be pulled in by default when 
> pulling in the kinesis connector, the current separation adds unnecessary 
> hassle to use the connector.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-15 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17806723#comment-17806723
 ] 

Jiabao Sun commented on FLINK-34076:


The purpose of "Stop bundling connector-base in externalized connectors" is to 
prevent external connectors from having compile-time dependencies on specific 
versions of flink-connector-base; instead, they obtain it at runtime from 
flink-dist, for better compatibility.

In fact, users do not need to explicitly declare the flink-connector-base 
dependency in the pom.xml file. It is already included in the dependency tree 
of the Kinesis connector and declared with "provided" scope. Users only need 
to add dependencies with "provided" scope to the classpath to run the test.

 !screenshot-3.png! 
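
The ClassNotFoundException from the report can also be probed directly. A 
minimal plain-JDK sketch (this is not Flink code; the class name is copied 
from the stack trace above) that checks whether the factory base class is on 
the classpath:

```java
// Probes the runtime classpath for the class whose absence causes the
// reported failure. If flink-connector-base is missing, Class.forName
// throws ClassNotFoundException, mirroring the stack trace above.
public class MissingBaseCheck {
    public static void main(String[] args) {
        String factory =
                "org.apache.flink.connector.base.table.AsyncDynamicTableSinkFactory";
        try {
            Class.forName(factory);
            System.out.println(factory + " is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println(
                    "missing: add flink-connector-base (provided scope) to the classpath");
        }
    }
}
```

Running this inside the IDE quickly shows whether the "provided" dependencies 
were actually added to the run classpath.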

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png, screenshot-2.png, screenshot-3.png
>
>
> The following issue occurs with flink-kinesis-connector v4.2.0 and Flink 
> 1.17; it works properly with kinesis connector v4.1.0 (I have not tested 
> versions before v4.1.0).
> The 
> [commit|https://github.com/apache/flink-connector-aws/commit/01f112bd5a69f95cd5d2a4bc7e08d1ba9a81d56a]
>  which stops bundling `flink-connector-base` with `flink-connector-kinesis` 
> causes the kinesis sink to fail to create when using the Table API, as 
> required classes from `flink-connector-base` are not loaded at runtime.
> E.g. with only the following dependency in pom.xml:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> {code}
> and a minimal job definition:
> {code:java}
>   public static void main(String[] args) throws Exception {
>   // create data stream environment
>   StreamExecutionEnvironment sEnv = 
> StreamExecutionEnvironment.getExecutionEnvironment();
>   sEnv.setRuntimeMode(RuntimeExecutionMode.STREAMING);
>   StreamTableEnvironment tEnv = 
> StreamTableEnvironment.create(sEnv);
>   Schema a = Schema.newBuilder().column("a", 
> DataTypes.STRING()).build();
>   TableDescriptor descriptor =
>   TableDescriptor.forConnector("kinesis")
>   .schema(a)
>   .format("json")
>   .build();
>   tEnv.createTemporaryTable("sinkTable", descriptor);
>   tEnv.executeSql("CREATE TABLE sinkTable " + 
> descriptor.toString()).print();
>   }
> {code}
> following exception will be thrown:
> {code:java}
> Caused by: java.lang.ClassNotFoundException: 
> org.apache.flink.connector.base.table.AsyncDynamicTableSinkFactory
>   at 
> jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581) 
> ~[?:?]
>   at 
> jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>  ~[?:?]
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:527) ~[?:?]
>   ... 28 more
> {code}
> The fix is to explicitly specify `flink-connector-base` as a dependency of 
> the project:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-base</artifactId>
>     <version>${flink.version}</version>
>     <scope>provided</scope>
> </dependency>
> {code}
> In general, `flink-connector-base` should be pulled in by default when 
> pulling in the kinesis connector, the current separation adds unnecessary 
> hassle to use the connector.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Updated] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-15 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-34076:
---
Attachment: screenshot-3.png

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png, screenshot-2.png, screenshot-3.png
>
>
> The following issue occurs with flink-kinesis-connector v4.2.0 and Flink 
> 1.17; it works properly with kinesis connector v4.1.0 (I have not tested 
> versions before v4.1.0).
> The 
> [commit|https://github.com/apache/flink-connector-aws/commit/01f112bd5a69f95cd5d2a4bc7e08d1ba9a81d56a]
>  which stops bundling `flink-connector-base` with `flink-connector-kinesis` 
> causes the kinesis sink to fail to create when using the Table API, as 
> required classes from `flink-connector-base` are not loaded at runtime.
> E.g. with only the following dependency in pom.xml:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> {code}
> and a minimal job definition:
> {code:java}
>   public static void main(String[] args) throws Exception {
>   // create data stream environment
>   StreamExecutionEnvironment sEnv = 
> StreamExecutionEnvironment.getExecutionEnvironment();
>   sEnv.setRuntimeMode(RuntimeExecutionMode.STREAMING);
>   StreamTableEnvironment tEnv = 
> StreamTableEnvironment.create(sEnv);
>   Schema a = Schema.newBuilder().column("a", 
> DataTypes.STRING()).build();
>   TableDescriptor descriptor =
>   TableDescriptor.forConnector("kinesis")
>   .schema(a)
>   .format("json")
>   .build();
>   tEnv.createTemporaryTable("sinkTable", descriptor);
>   tEnv.executeSql("CREATE TABLE sinkTable " + 
> descriptor.toString()).print();
>   }
> {code}
> following exception will be thrown:
> {code:java}
> Caused by: java.lang.ClassNotFoundException: 
> org.apache.flink.connector.base.table.AsyncDynamicTableSinkFactory
>   at 
> jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581) 
> ~[?:?]
>   at 
> jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>  ~[?:?]
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:527) ~[?:?]
>   ... 28 more
> {code}
> The fix is to explicitly specify `flink-connector-base` as a dependency of 
> the project:
> {code:xml}
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-kinesis</artifactId>
>     <version>${flink.connector.kinesis.version}</version>
> </dependency>
> <dependency>
>     <groupId>org.apache.flink</groupId>
>     <artifactId>flink-connector-base</artifactId>
>     <version>${flink.version}</version>
>     <scope>provided</scope>
> </dependency>
> {code}
> In general, `flink-connector-base` should be pulled in by default when 
> pulling in the kinesis connector, the current separation adds unnecessary 
> hassle to use the connector.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-33816) SourceStreamTaskTest.testTriggeringStopWithSavepointWithDrain failed due async checkpoint triggering not being completed

2024-01-15 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-33816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17806705#comment-17806705
 ] 

Jiabao Sun commented on FLINK-33816:


Resuming the first breakpoint unblocks the main thread and triggers an 
assertion on whether the CompletableFuture is completed. However, at that 
point the result has not yet been returned, so the CompletableFuture seen by 
the main thread is still incomplete.
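
A minimal plain-JDK sketch of that race (this is not the actual Flink test 
code): the main thread checks the future before the blocked async side is 
allowed to complete it, and observes it as incomplete.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CountDownLatch;

public class IncompleteFutureRace {
    public static void main(String[] args) throws Exception {
        CountDownLatch proceed = new CountDownLatch(1);
        CompletableFuture<Void> checkpointFuture = new CompletableFuture<>();

        // The async side completes the future only after it is released,
        // standing in for the thread paused at the breakpoint.
        Thread async = new Thread(() -> {
            try {
                proceed.await();
                checkpointFuture.complete(null);
            } catch (InterruptedException ignored) {
            }
        });
        async.start();

        // Main thread runs first and checks the future too early.
        System.out.println("before completion: " + checkpointFuture.isDone()); // false

        proceed.countDown(); // "resume the breakpoint"
        async.join();
        System.out.println("after completion: " + checkpointFuture.isDone()); // true
    }
}
```

An assertion placed at the first println is exactly the one that fails in the 
flaky run described above.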

> SourceStreamTaskTest.testTriggeringStopWithSavepointWithDrain failed due 
> async checkpoint triggering not being completed 
> -
>
> Key: FLINK-33816
> URL: https://issues.apache.org/jira/browse/FLINK-33816
> Project: Flink
>  Issue Type: Sub-task
>  Components: Runtime / Checkpointing, Runtime / Coordination
>Affects Versions: 1.19.0
>Reporter: Matthias Pohl
>Priority: Major
>  Labels: github-actions, pull-request-available, test-stability
> Attachments: screenshot-1.png
>
>
> [https://github.com/XComp/flink/actions/runs/7182604625/job/19559947894#step:12:9430]
> {code:java}
> Error: 14:39:01 14:39:01.930 [ERROR] Tests run: 16, Failures: 1, Errors: 0, 
> Skipped: 0, Time elapsed: 1.878 s <<< FAILURE! - in 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTaskTest
> 9426Error: 14:39:01 14:39:01.930 [ERROR] 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTaskTest.testTriggeringStopWithSavepointWithDrain
>   Time elapsed: 0.034 s  <<< FAILURE!
> 9427Dec 12 14:39:01 org.opentest4j.AssertionFailedError: 
> 9428Dec 12 14:39:01 
> 9429Dec 12 14:39:01 Expecting value to be true but was false
> 9430Dec 12 14:39:01   at 
> java.base/jdk.internal.reflect.DirectConstructorHandleAccessor.newInstance(DirectConstructorHandleAccessor.java:62)
> 9431Dec 12 14:39:01   at 
> java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:502)
> 9432Dec 12 14:39:01   at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTaskTest.testTriggeringStopWithSavepointWithDrain(SourceStreamTaskTest.java:710)
> 9433Dec 12 14:39:01   at 
> java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
> 9434Dec 12 14:39:01   at 
> java.base/java.lang.reflect.Method.invoke(Method.java:580)
> [...] {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-34087) [JUnit5 Migration] Module: flink-dist

2024-01-15 Thread Jiabao Sun (Jira)
Jiabao Sun created FLINK-34087:
--

 Summary: [JUnit5 Migration] Module: flink-dist
 Key: FLINK-34087
 URL: https://issues.apache.org/jira/browse/FLINK-34087
 Project: Flink
  Issue Type: Sub-task
  Components: Tests
Reporter: Jiabao Sun






--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Commented] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-15 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17806677#comment-17806677
 ] 

Jiabao Sun commented on FLINK-34076:


Hi [~khanhvu],

flink-connector-base is already included in flink-dist, and we no longer 
package it in the externalized connectors [1].
"provided" dependencies are also inherited transitively. You just need to check 
"Add dependencies with 'provided' scope to classpath" in IDEA.


{code:xml}
<properties>
    <flink.version>1.17.0</flink.version>
    <flink.connector.kinesis.version>4.2.0-1.17</flink.connector.kinesis.version>
    <scala.binary.version>2.12</scala.binary.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kinesis</artifactId>
        <version>${flink.connector.kinesis.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-api-java-bridge</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-planner-loader</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-runtime</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
{code}

 !screenshot-2.png! 

[1] https://issues.apache.org/jira/browse/FLINK-30400

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png, screenshot-2.png
>
>
> The following issue occurs with flink-kinesis-connector v4.2.0 and Flink 
> 1.17; it works properly with kinesis connector v4.1.0 (I have not tested 
> versions before v4.1.0).
> The 
> [commit|https://github.com/apache/flink-connector-aws/commit/01f112bd5a69f95cd5d2a4bc7e08d1ba9a81d56a]
>  which stopped bundling `flink-connector-base` with `flink-connector-kinesis` 
> causes the kinesis sink to fail to create when using the Table API, as 
> required classes from `flink-connector-base` are not loaded at runtime.
> E.g. with only the following dependency in pom.xml:
> {code:xml}
>         <dependency>
>             <groupId>org.apache.flink</groupId>
>             <artifactId>flink-connector-kinesis</artifactId>
>             <version>${flink.connector.kinesis.version}</version>
>         </dependency>
> {code}
> and a minimal job definition:
> {code:java}
>   public static void main(String[] args) throws Exception {
>       // create data stream environment
>       StreamExecutionEnvironment sEnv = StreamExecutionEnvironment.getExecutionEnvironment();
>       sEnv.setRuntimeMode(RuntimeExecutionMode.STREAMING);
>       StreamTableEnvironment tEnv = StreamTableEnvironment.create(sEnv);
>       Schema a = Schema.newBuilder().column("a", DataTypes.STRING()).build();
>       TableDescriptor descriptor =
>               TableDescriptor.forConnector("kinesis")
>                       .schema(a)
>                       .format("json")
>                       .build();
>       tEnv.createTemporaryTable("sinkTable", descriptor);
>       tEnv.executeSql("CREATE TABLE sinkTable " + descriptor.toString()).print();
>   }
> {code}
> following exception will be thrown:
> {code:java}
> Caused by: java.lang.ClassNotFoundException: org.apache.flink.connector.base.table.AsyncDynamicTableSinkFactory
>   at jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581) ~[?:?]
>   at jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178) ~[?:?]
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:527) ~[?:?]
>   ... 28 more
> {code}
> The fix is to explicitly declare `flink-connector-base` as a dependency of 
> the project:
> {code:xml}
>   <dependency>
>       <groupId>org.apache.flink</groupId>
>       <artifactId>flink-connector-kinesis</artifactId>
>       <version>${flink.connector.kinesis.version}</version>
>   </dependency>
>   <dependency>
>       <groupId>org.apache.flink</groupId>
>       <artifactId>flink-connector-base</artifactId>
>       <version>${flink.version}</version>
>       <scope>provided</scope>
>   </dependency>
> {code}
> In general, `flink-connector-base` should be pulled in by default when 
> pulling in the kinesis connector; the current separation adds unnecessary 
> hassle when using the connector.





[jira] [Updated] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-15 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-34076:
---
Attachment: screenshot-2.png

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png, screenshot-2.png
>
>





[jira] [Commented] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-14 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17806605#comment-17806605
 ] 

Jiabao Sun commented on FLINK-34076:


Hi [~khanhvu], did you add the dependencies with "provided" scope to the classpath?
It runs correctly for me locally.


{code:xml}
<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kinesis</artifactId>
        <version>${flink.connector.kinesis.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-base</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-api-java-bridge</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-planner-loader</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-runtime</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
{code}
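As a quick diagnostic for this kind of failure, one can probe the classpath directly before building the job. The class below is a hypothetical helper (not part of Flink or this thread); it only checks whether the factory class the kinesis table sink needs is loadable in the current launch configuration:

```java
// Hypothetical diagnostic, not from this thread: check whether a class is
// resolvable before the table planner tries to instantiate it.
public class ConnectorBaseProbe {

    /** Returns true if className can be resolved on the current classpath. */
    static boolean isPresent(String className) {
        try {
            // initialize=false: we only care about resolvability, not static init
            Class.forName(className, false, ConnectorBaseProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = "org.apache.flink.connector.base.table.AsyncDynamicTableSinkFactory";
        System.out.println(isPresent(cls)
                ? cls + " is on the classpath"
                : "flink-connector-base is missing from the runtime classpath");
    }
}
```

Running this in the same launch configuration as the job shows immediately whether the provided-scope setup actually reached the runtime classpath.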


> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png
>
>





[jira] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-14 Thread Jiabao Sun (Jira)


[ https://issues.apache.org/jira/browse/FLINK-34076 ]


Jiabao Sun deleted comment on FLINK-34076:


was (Author: jiabao.sun):
Hi [~khanhvu], did you add the dependencies with "provided" scope to the classpath?

 !screenshot-1.png! 

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png
>
>





[jira] [Updated] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-14 Thread Jiabao Sun (Jira)


 [ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jiabao Sun updated FLINK-34076:
---
Attachment: screenshot-1.png

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png
>
>





[jira] [Commented] (FLINK-34076) flink-connector-base missing fails kinesis table sink to create

2024-01-14 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-34076?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17806603#comment-17806603
 ] 

Jiabao Sun commented on FLINK-34076:


Hi [~khanhvu], did you add the dependencies with "provided" scope to the classpath?

 !screenshot-1.png! 

> flink-connector-base missing fails kinesis table sink to create
> ---
>
> Key: FLINK-34076
> URL: https://issues.apache.org/jira/browse/FLINK-34076
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kinesis
>Affects Versions: aws-connector-4.2.0
>Reporter: Khanh Vu
>Priority: Major
> Attachments: screenshot-1.png
>
>





[jira] [Commented] (FLINK-33816) SourceStreamTaskTest.testTriggeringStopWithSavepointWithDrain failed due async checkpoint triggering not being completed

2024-01-01 Thread Jiabao Sun (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-33816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17801662#comment-17801662
 ] 

Jiabao Sun commented on FLINK-33816:


 !screenshot-1.png! 

Add thread-suspending breakpoints at StreamTask:1177 and StreamTask:1194,
then resume the first breakpoint; the error will be reproduced.
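The timing described above can be sketched generically (hypothetical names; this is not Flink's SourceStreamTaskTest code): holding the async thread past the assertion, the way a suspended breakpoint does, makes the premature check observe false:

```java
import java.util.concurrent.CountDownLatch;

// Generic sketch of the race (hypothetical, not Flink code): a flag set by an
// async action reads false if the async thread is held until after the check.
public class AsyncTriggerRace {
    static volatile boolean triggered = false;

    static boolean checkBeforeCompletion() throws InterruptedException {
        CountDownLatch hold = new CountDownLatch(1);
        Thread async = new Thread(() -> {
            try {
                hold.await();     // stands in for the suspended breakpoint
                triggered = true; // the async trigger finally completing
            } catch (InterruptedException ignored) {
                Thread.currentThread().interrupt();
            }
        });
        async.start();
        boolean observed = triggered; // the test's assertion runs too early
        hold.countDown();             // "resume the first breakpoint"
        async.join();
        return observed;              // false: completion happened after the check
    }
}
```

The fix for such a test is to wait for the async action (e.g. on a future or latch) before asserting, rather than reading the flag immediately.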

> SourceStreamTaskTest.testTriggeringStopWithSavepointWithDrain failed due 
> async checkpoint triggering not being completed 
> -
>
> Key: FLINK-33816
> URL: https://issues.apache.org/jira/browse/FLINK-33816
> Project: Flink
>  Issue Type: Sub-task
>  Components: Runtime / Checkpointing, Runtime / Coordination
>Affects Versions: 1.19.0
>Reporter: Matthias Pohl
>Priority: Major
>  Labels: github-actions, test-stability
> Attachments: screenshot-1.png
>
>
> [https://github.com/XComp/flink/actions/runs/7182604625/job/19559947894#step:12:9430]
> {code:java}
> Error: 14:39:01 14:39:01.930 [ERROR] Tests run: 16, Failures: 1, Errors: 0, 
> Skipped: 0, Time elapsed: 1.878 s <<< FAILURE! - in 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTaskTest
> 9426Error: 14:39:01 14:39:01.930 [ERROR] 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTaskTest.testTriggeringStopWithSavepointWithDrain
>   Time elapsed: 0.034 s  <<< FAILURE!
> 9427Dec 12 14:39:01 org.opentest4j.AssertionFailedError: 
> 9428Dec 12 14:39:01 
> 9429Dec 12 14:39:01 Expecting value to be true but was false
> 9430Dec 12 14:39:01   at 
> java.base/jdk.internal.reflect.DirectConstructorHandleAccessor.newInstance(DirectConstructorHandleAccessor.java:62)
> 9431Dec 12 14:39:01   at 
> java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:502)
> 9432Dec 12 14:39:01   at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTaskTest.testTriggeringStopWithSavepointWithDrain(SourceStreamTaskTest.java:710)
> 9433Dec 12 14:39:01   at 
> java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
> 9434Dec 12 14:39:01   at 
> java.base/java.lang.reflect.Method.invoke(Method.java:580)
> [...] {code}




