yuxiqian commented on code in PR #3430:
URL: https://github.com/apache/flink-cdc/pull/3430#discussion_r1659427878


##########
docs/content/docs/developer-guide/contribute-to-flink-cdc.md:
##########
@@ -82,3 +83,56 @@ not need a long description.
 
 3. Are the Documentation Updated? If the pull request introduces a new feature, the feature should be documented.
+
+<h2 id="release-validation-guide">Release Verification Guide</h2>
+
+We will prepare new releases of Flink CDC regularly.
+
+According to the Apache Software Foundation releasing SOP,
+we will make a release candidate version before each release,
+and invite community members to test and vote on this pre-release version.
+
+Everyone is welcome to participate in the version verification work on the `[email protected]` mailing list.
+The verification may include the following aspects:
+
+1. Verify that the source code compiles successfully.
+
+Currently, Flink CDC uses [Maven](https://maven.apache.org/) 3 as the build tool and compiles on the JDK 8 platform.
+You can download the source package of the RC version, compile it with the `mvn clean package -Dfast` command,
+and check for any unexpected errors or warnings.
+
+2. Verify that the tarball checksum matches.
+
+To ensure the authenticity and integrity of released binary packages, a SHA512 hash of the corresponding file is attached to every released binary tarball so that users can verify its integrity.
+You can download the binary tarball of the RC version and calculate its SHA512 hash with one of the following commands:
+
+* Linux: `sha512sum flink-cdc-*-bin.tar.gz`
+* macOS: `shasum -a 512 flink-cdc-*-bin.tar.gz`
+* Windows (PowerShell): `Get-FileHash flink-cdc-*-bin.tar.gz -Algorithm SHA512 | Format-List`
+
+3. Verify that the binary package was compiled with JDK 8.
+
+Unpack the precompiled binary jar package and check that the `Build-Jdk` entry in its `META-INF/MANIFEST.MF` file is correct.
+
+4. Run migration tests.
+
+Flink CDC tries to ensure backward compatibility of state: job state (checkpoints/savepoints) saved with a previous CDC version should remain usable in the new version.
+You can run the CDC migration verification locally with the [Flink CDC Migration Test Utils](https://github.com/yuxiqian/migration-test) scripts:
+
+* [Pipeline Job Migration Test Guide](https://github.com/yuxiqian/migration-test/blob/main/README.md)
+* [DataStream Job Migration Test Guide](https://github.com/yuxiqian/migration-test/blob/main/datastream/README.md)

Review Comment (on the Maven/JDK line):

Well, it seems a little arbitrary: the Maven version isn't always consistent. I noticed that CI uses Maven 3.8.6, but the latest 3.1.1 release was compiled with 3.9.1. I'm not sure whether every Maven 3+ version works.

Review Comment (on the migration test guide links):

I would prefer putting it in the CDC repository along with the existing CI workflows, since it doesn't seem significant enough to warrant an independent repo. But first of all, I need to clean up the scripts before they could be accepted by any upstream repository.
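As a side note, the checksum step described in the quoted guide can be exercised end to end with GNU coreutils alone. The sketch below uses a stand-in file (`sample-bin.tar.gz` is a hypothetical name, not a real release artifact) to show how a published `.sha512` file is produced and then verified:

```shell
# Create a stand-in for a release tarball (hypothetical name and contents).
printf 'release contents\n' > sample-bin.tar.gz

# The release manager publishes a checksum file alongside the tarball:
sha512sum sample-bin.tar.gz > sample-bin.tar.gz.sha512

# A verifier recomputes the hash and compares it against the published one;
# prints "sample-bin.tar.gz: OK" on a match, and exits non-zero otherwise.
sha512sum -c sample-bin.tar.gz.sha512
```

For a real RC you would run `sha512sum -c` against the downloaded `.sha512` file in the same directory as the tarball; if a project publishes the checksum in a different layout, comparing the two hash strings by eye works just as well.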
Review Comment (on the `According to the Apache Software Foundation releasing SOP` line):

Added a link to https://www.apache.org/legal/release-policy.html.
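The `Build-Jdk` check from step 3 of the quoted guide can be sketched as follows. Since a jar is just a zip archive, `unzip -p <jar> META-INF/MANIFEST.MF` prints the manifest to stdout; the block below fakes an already-extracted manifest (the jar name in the comment and the version string are made up) so the check itself can be shown:

```shell
# Stand-in for a manifest extracted from a release jar; in practice:
#   unzip -p flink-cdc-dist-<version>.jar META-INF/MANIFEST.MF
# (the jar name above and the Build-Jdk value below are hypothetical)
cat > MANIFEST.MF <<'EOF'
Manifest-Version: 1.0
Build-Jdk: 1.8.0_392
EOF

# The verification: the entry should name a JDK 8 build (version 1.8.x).
grep '^Build-Jdk: 1.8' MANIFEST.MF
```

Note that `Build-Jdk` is an entry written by the Maven archiver, so it reflects the JDK the release manager compiled with, not the minimum runtime version.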
Review Comment (on the checksum commands):

Thanks for @morazow's clear instructions!

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
