lanshiqin commented on code in PR #13584:
URL: https://github.com/apache/kafka/pull/13584#discussion_r1191984789


##########
core/src/test/scala/unit/kafka/log/LogSegmentTest.scala:
##########
@@ -65,6 +68,32 @@ class LogSegmentTest {
     Utils.delete(logDir)
   }
 
+  /**
+   * LogSegmentOffsetOverflowException should be thrown while appending the logs if:
+   * 1. largestOffset < 0
+   * 2. largestOffset - baseOffset < 0
+   * 3. largestOffset - baseOffset > Integer.MAX_VALUE
+   */
+  @ParameterizedTest
+  @CsvSource(Array(
+    "0, -2147483648",
+    "0, 2147483648",
+    "1, 0",
+    "100, 10",
+    "2147483648, 0",
+    "-2147483648, 0",
+    "2147483648,4294967296"
+  ))
+  def testAppendForLogSegmentOffsetOverflowException(baseOffset: Long, largestOffset: Long): Unit = {
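
Every (baseOffset, largestOffset) pair in the `@CsvSource` above violates the relative-offset constraint described in the doc comment. A minimal sketch of that constraint, using an illustrative helper name rather than Kafka's actual internal API:

```java
public class RelativeOffsetCheck {
    // Illustrative restatement of the constraint LogSegment.append enforces:
    // an offset is appendable only when (largestOffset - baseOffset) fits in a
    // non-negative Java int, because index entries store offsets relative to
    // the segment's base offset as 4-byte ints.
    static boolean canConvertToRelativeOffset(long largestOffset, long baseOffset) {
        long relative = largestOffset - baseOffset;
        return relative >= 0 && relative <= Integer.MAX_VALUE;
    }

    public static void main(String[] args) {
        // The (baseOffset, largestOffset) pairs from the test's @CsvSource;
        // each one fails the check, so append should throw.
        long[][] cases = {
            {0L, -2147483648L}, {0L, 2147483648L}, {1L, 0L}, {100L, 10L},
            {2147483648L, 0L}, {-2147483648L, 0L}, {2147483648L, 4294967296L}
        };
        for (long[] c : cases) {
            System.out.println(c[0] + ", " + c[1] + " -> "
                + canConvertToRelativeOffset(c[1], c[0]));
        }
    }
}
```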

Review Comment:
   Sorry, I need to correct that. 
   When baseOffset == largestOffset, LogSegment.append executes normally without throwing an exception. In my original test parameters, shallowOffsetOfMaxTimestamp was 0, which affected the later steps of the append: `TimeIndex#maybeAppend` calls `relativeOffset(offset)` with offset = 0, and when baseOffset is greater than 0 the result is negative, which raises an exception.
   I have fixed the shallowOffsetOfMaxTimestamp value in this test.
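
The underflow described above can be sketched as follows. This is a hedged simplification of the relative-offset computation the comment refers to (`TimeIndex#maybeAppend` calling `relativeOffset`); the method name and exception type are illustrative, not Kafka's exact implementation:

```java
public class TimeIndexRelativeOffset {
    // Simplified sketch: an index stores offsets relative to its base offset
    // as ints, so a candidate offset below the base offset (or too far above
    // it) cannot be represented and is rejected.
    static int relativeOffset(long baseOffset, long offset) {
        long relative = offset - baseOffset;
        if (relative < 0 || relative > Integer.MAX_VALUE)
            throw new IllegalArgumentException(
                "Offset " + offset + " is out of range for base offset " + baseOffset);
        return (int) relative;
    }

    public static void main(String[] args) {
        // With shallowOffsetOfMaxTimestamp = 0 and baseOffset > 0,
        // offset - baseOffset is negative, so the call throws:
        try {
            relativeOffset(100L, 0L);
        } catch (IllegalArgumentException e) {
            System.out.println("threw: " + e.getMessage());
        }
    }
}
```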



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
