[ https://issues.apache.org/jira/browse/FLINK-25509?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17708755#comment-17708755 ]
Martijn Visser commented on FLINK-25509:
----------------------------------------
[~lindong] Yes, it was one of the reasons for externalization. There's
https://cwiki.apache.org/confluence/display/FLINK/Externalized+Connector+development
for all the info, including discussion threads such as
https://lists.apache.org/thread/bywh947r2f5hfocxq598zhyh06zhksrm. That thread
was also a driver for FLIP-196
https://cwiki.apache.org/confluence/display/FLINK/FLIP-196%3A+Source+API+stability+guarantees
and FLIP-197
https://cwiki.apache.org/confluence/display/FLINK/FLIP-197%3A+API+stability+graduation+process.
I believe this situation is similar to
https://issues.apache.org/jira/browse/FLINK-31324 which was fixed for 1.17.
In the end, it's up to the maintainer of a connector how the required minimum of
two supported versions is handled, as outlined on
https://cwiki.apache.org/confluence/display/FLINK/Externalized+Connector+development.
There are multiple ways documented there.
> FLIP-208: Add RecordEvaluator to dynamically stop source based on
> de-serialized records
> ---------------------------------------------------------------------------------------
>
> Key: FLINK-25509
> URL: https://issues.apache.org/jira/browse/FLINK-25509
> Project: Flink
> Issue Type: New Feature
> Components: Connectors / Common, Connectors / Kafka
> Reporter: Dong Lin
> Assignee: Hang Ruan
> Priority: Major
> Labels: pull-request-available
>
> This feature is needed to migrate applications that use
> KafkaDeserializationSchema::isEndOfStream() from FlinkKafkaConsumer to
> KafkaSource.
> Please check out
> https://cwiki.apache.org/confluence/display/FLINK/FLIP-208%3A+Add+RecordEvaluator+to+dynamically+stop+source+based+on+de-serialized+records
> for the motivation and the proposed changes.
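
For illustration, a minimal sketch of what the migration could look like, assuming
the RecordEvaluator interface (a single isEndOfStream(T record) method) and a
KafkaSourceBuilder#setEofRecordEvaluator hook as described in FLIP-208; the broker
address, topic name, and end-of-stream marker below are made up for the example.

    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.kafka.source.KafkaSource;
    import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;

    public class EofEvaluatorExample {

        public static KafkaSource<String> buildSource() {
            return KafkaSource.<String>builder()
                    .setBootstrapServers("localhost:9092")       // hypothetical broker
                    .setTopics("input-topic")                    // hypothetical topic
                    .setStartingOffsets(OffsetsInitializer.earliest())
                    .setValueOnlyDeserializer(new SimpleStringSchema())
                    // Replacement for KafkaDeserializationSchema#isEndOfStream():
                    // the reader stops consuming a split once the evaluator
                    // returns true for a de-serialized record.
                    .setEofRecordEvaluator(record -> record.contains("END_OF_STREAM"))
                    .build();
        }
    }

Unlike isEndOfStream(), which lived inside the deserialization schema, the evaluator
receives the already de-serialized record, so the same deserializer can be reused
with or without a stopping condition.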
--
This message was sent by Atlassian Jira
(v8.20.10#820010)