Copilot commented on code in PR #4296:
URL: https://github.com/apache/flink-cdc/pull/4296#discussion_r2877609532


##########
flink-cdc-connect/flink-cdc-source-connectors/flink-connector-mysql-cdc/src/main/java/org/apache/flink/cdc/connectors/mysql/source/assigners/MySqlChunkSplitter.java:
##########
@@ -334,6 +334,9 @@ private Object nextChunkEnd(
         Object chunkEnd =
                 StatementUtils.queryNextChunkMax(
                        jdbc, tableId, splitColumnName, chunkSize, previousChunkEnd);
+        if (chunkEnd == null) {
+            return null;
+        }

Review Comment:
   Consider adding a regression test that covers the scenario fixed here: 
`queryNextChunkMax(...)` returning `null` (e.g., the max row being deleted 
between the earlier MIN/MAX query and the subsequent chunk-end queries). This 
change prevents an NPE, but without a test it would be easy for the 
null-handling path in `nextChunkEnd` to regress. A unit test could stub the 
JDBC/`StatementUtils` interaction, or an ITCase could delete the max row after 
the MIN/MAX bounds are determined and assert that chunk splitting completes 
without exceptions.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
