PatrickRen commented on code in PR #739:
URL: https://github.com/apache/flink-web/pull/739#discussion_r1604313283


##########
docs/content/posts/2024-MM-DD-release-cdc-3.1.0.md:
##########
@@ -0,0 +1,67 @@
+---
+title:  "Apache Flink CDC 3.1.0 Release Announcement"
+date: "2024-MM-DDT08:00:00.000Z"
+authors:
+- renqs:
+  name: "Qingsheng Ren"
+  twitter: "renqstuite"
+aliases:
+- /news/2024/MM/DD/release-cdc-3.1.0.html
+---
+
+The Apache Flink community is excited to announce the release of Flink CDC 3.1.0! This is the first release since the community accepted the donation of Flink CDC as a sub-project of Apache Flink, and it brings exciting new features such as transform and table merging. The ecosystem of Flink CDC keeps expanding, including new Kafka and Paimon pipeline sinks and enhancements to existing connectors.
+
+We'd like to invite you to check out the [Flink CDC documentation](https://nightlies.apache.org/flink/flink-cdc-docs-stable) and try out [the quickstart tutorial](https://nightlies.apache.org/flink/flink-cdc-docs-release-3.0/docs/get-started/introduction) to explore the world of Flink CDC. We also encourage you to [download the release](https://flink.apache.org/downloads.html) and share your feedback with the community through the Flink [mailing lists](https://flink.apache.org/community.html#mailing-lists) or [JIRA](https://issues.apache.org/jira/browse/flink)! We hope you like the new release and we'd love to hear about your experience with it.
+
+## Highlights
+
+### Transformation Support in Pipeline
+
+Flink CDC 3.1.0 introduces the ability to perform transformations within the CDC pipeline. By adding a `transform` section to the YAML pipeline definition, users can now easily apply a variety of transformations to data change events from the source, including projections, calculations, and the addition of constant columns, enhancing the effectiveness of data integration pipelines. These transformations are defined with an SQL-like syntax, so users can pick up the new feature quickly.
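As a sketch of what such a `transform` section might look like (the table name, expressions, and column names below are illustrative, not taken from the release notes):

```yaml
transform:
  - source-table: app_db.orders            # source table to transform (illustrative name)
    projection: id, order_id, UPPER(product_name) AS product_name, 'web' AS channel
    filter: order_id > 100                 # keep only matching change events
    description: project columns, add a constant column, and filter rows
```

Here `projection` selects and computes output columns with SQL-like expressions, while `filter` drops change events that do not match the predicate.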
+
+### Table Merging Support
+
+Flink CDC 3.1.0 now supports merging multiple tables into one by configuring `route` in the YAML pipeline definition. It is common for business data to be partitioned across multiple tables, or even multiple databases, because of its large volume. By configuring routes that map multiple source tables to one destination table, data change events will be merged into the same destination table. Moreover, schema changes on the source tables will also be applied to the destination.
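For example, a `route` section that merges sharded tables into a single destination might look like the following sketch (the database and table names are illustrative):

```yaml
route:
  - source-table: app_db.orders_shard_\.*  # pattern matching orders_shard_1, orders_shard_2, ...
    sink-table: ods_db.orders              # all matched tables merge into this destination table
    description: merge sharded order tables into one destination table
```

Change events from every table matched by `source-table` are written to the single table named in `sink-table`.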
+
+### Connectors
+
+#### Distributions of MySQL / Oracle / OceanBase / Db2 connectors
+
+Unfortunately, due to license incompatibilities, we cannot ship the JDBC drivers of the following connectors with our binary release:
+
+- Db2
+- MySQL
+- Oracle
+- OceanBase
+
+Please manually download the corresponding JDBC driver into `$FLINK_CDC_HOME/lib` and `$FLINK_HOME/lib`, and specify their paths with `--jar` when submitting YAML pipelines, or make sure they are on the classpath if you are using Flink SQL.
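As a sketch of the two options for a YAML pipeline submission (the driver file name, pipeline file name, and paths below are illustrative):

```shell
# Option 1: drop the driver into Flink's lib directory
cp mysql-connector-j-8.0.33.jar $FLINK_HOME/lib/

# Option 2: keep the driver anywhere locally and point to it at submission time
bin/flink-cdc.sh mysql-pipeline.yaml --jar /path/to/mysql-connector-j-8.0.33.jar
```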

Review Comment:
   Yep, I will rewrite this part. There are two ways to include the MySQL JDBC driver:
   
   1. Put it into `FLINK_HOME/lib` directly
   2. Download it anywhere locally and specify its path with `--jar`. It does not necessarily need to be in `FLINK_CDC_HOME/lib`



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
