> I mean CDC should be handled on the Kafka side. 
What do you mean by that? Do you mean that Kafka should store the messages 
in a CDC format like Debezium [1], Canal [2], Maxwell [3], or OGG [4]? 

> Or should I need to use Table API 

I'm afraid not. It seems you can keep using the Flink DataStream API; the 
Table API makes no difference for your case. 

BTW, you can try Flink CDC [5]. 
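
In case it helps, here is a minimal sketch of how those CDC formats are used on the SQL side (the topic name, schema, and broker address below are placeholders, not from your setup):

```sql
-- Declare a table over a Kafka topic that Debezium populates.
-- 'debezium-json' tells Flink to interpret each message as a CDC
-- changelog event (INSERT / UPDATE / DELETE) rather than a plain
-- append-only record.
CREATE TABLE products (
  id BIGINT,
  name STRING,
  price DECIMAL(10, 2)
) WITH (
  'connector' = 'kafka',
  'topic' = 'products',                        -- placeholder topic
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'debezium-json'
);
```

The same DDL works with 'canal-json', 'maxwell-json', or 'ogg-json' if your topic carries one of those formats instead. 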

[1] https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/formats/debezium/ 
[2] https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/formats/canal/ 
[3] https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/formats/maxwell/ 
[4] https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/formats/ogg/ 
[5] https://ververica.github.io/flink-cdc-connectors/ 


Best regards, 
Yuxia 


From: "Sid" <flinkbyhe...@gmail.com> 
To: "User" <user@flink.apache.org> 
Sent: Saturday, June 25, 2022, 6:32:22 PM 
Subject: How to make current application cdc 

Hello, 

I have a current flow where data from the Flink-Kafka connector is captured 
and processed using the Flink DataStream API and stored in Kafka topics. However, I 
would like to make it CDC-enabled. I went through an article which mentioned 
that it should be handled on the Kafka side while capturing the data; that is, 
CDC should be handled on the Kafka side. Or do I need to use the Table API? 
Any ideas/links are much appreciated as I am trying to understand these 
concepts. 

TIA, 
Sid 
