PatrickRen commented on a change in pull request #16590:
URL: https://github.com/apache/flink/pull/16590#discussion_r688258957
##########
File path: flink-connectors/flink-connector-kafka/src/test/java/org/apache/flink/connector/kafka/source/KafkaSourceITCase.java
##########
@@ -26,6 +26,15 @@
import org.apache.flink.configuration.Configuration;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.connector.kafka.source.reader.deserializer.KafkaRecordDeserializationSchema;
+import org.apache.flink.connector.kafka.source.testutils.KafkaMultipleTopicExternalContext;
Review comment:
The existing KafkaSourceITCase uses ```@BeforeAll``` and ```@AfterAll```
to start up an in-memory Kafka broker. If I only nest the testing framework
cases, that in-memory Kafka broker keeps running until all cases, including
the testing framework cases, have finished.
So I use the ```KafkaSpecificTests``` nested class to wrap all existing IT
cases and destroy the in-memory Kafka broker before the testing framework cases run.
"KafkaSpecific" means these cases test Kafka source features such as the start
timestamp, the serializer, etc.
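
For illustration, here is a minimal sketch of the JUnit 5 nesting idea described above. It is not the actual Flink test code; the method bodies and the commented-out helper calls are placeholders, and ```@TestInstance(PER_CLASS)``` is used so that ```@BeforeAll```/```@AfterAll``` can be non-static inside the ```@Nested``` class.

```java
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;

class KafkaSourceITCase {

    @Nested
    @TestInstance(TestInstance.Lifecycle.PER_CLASS)
    class KafkaSpecificTests {

        @BeforeAll
        void setup() {
            // Start the in-memory Kafka broker (placeholder for the real setup call).
            // KafkaSourceTestEnv.setup();
        }

        @AfterAll
        void tearDown() {
            // Destroy the in-memory Kafka broker once the Kafka-specific cases
            // finish, before any testing framework cases run (placeholder).
            // KafkaSourceTestEnv.tearDown();
        }

        @Test
        void testStartFromTimestamp() {
            // Existing Kafka-specific IT case (placeholder body).
        }
    }

    // Testing framework cases would live outside KafkaSpecificTests,
    // so they no longer depend on the broker started above.
}
```

With this layout, the broker's lifecycle is scoped to the nested class: ```@AfterAll``` runs as soon as the Kafka-specific cases complete, rather than at the end of the whole test class.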
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]