Hi Arvid,

I have some bandwidth to contribute to this task and am familiar with the code. 
Could you or another committer assign me this ticket?

Thanks,
Mason

> On Oct 30, 2021, at 5:24 AM, Arvid Heise <ar...@apache.org> wrote:
> 
> Hi Mason,
> 
> thanks for creating that.
> 
> We are happy to take contributions (I flagged it as a starter task).
> 
> On Wed, Oct 27, 2021 at 2:36 AM Mason Chen <mason.c...@apple.com> wrote:
> Hi all,
> 
> I have a similar requirement to Preston. I created 
> https://issues.apache.org/jira/browse/FLINK-24660 to track this effort.
> 
> Best,
> Mason
> 
>> On Oct 18, 2021, at 1:59 AM, Arvid Heise <ar...@apache.org> wrote:
>> 
>> Hi Preston,
>> 
>> If you still need to set the KafkaSubscriber explicitly, could you please create
>> a feature request for that? For now, you probably have to resort to
>> reflection hacks and build against the non-public KafkaSubscriber.
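>> 
>> Something along these lines might work as a stopgap. This is an untested sketch that
>> pokes at Flink internals; in particular, the private field name "subscriber" on
>> KafkaSourceBuilder is an assumption and may differ between versions:
>> 
>> import java.lang.reflect.Field;
>> 
>> import org.apache.flink.api.common.serialization.SimpleStringSchema;
>> import org.apache.flink.connector.kafka.source.KafkaSource;
>> import org.apache.flink.connector.kafka.source.KafkaSourceBuilder;
>> import org.apache.flink.connector.kafka.source.enumerator.subscriber.KafkaSubscriber;
>> 
>> // Untested sketch: inject a custom KafkaSubscriber into the builder via reflection.
>> static KafkaSource<String> buildWithCustomSubscriber(KafkaSubscriber mySubscriber)
>>         throws ReflectiveOperationException {
>>     KafkaSourceBuilder<String> builder = KafkaSource.<String>builder()
>>             .setBootstrapServers("broker:9092")              // placeholder
>>             .setTopics("placeholder-topic")                  // installs a default subscriber
>>             .setValueOnlyDeserializer(new SimpleStringSchema());
>> 
>>     // "subscriber" is an assumed internal field name, not a supported API.
>>     Field field = KafkaSourceBuilder.class.getDeclaredField("subscriber");
>>     field.setAccessible(true);
>>     field.set(builder, mySubscriber);                        // swap in the custom subscriber
>>     return builder.build();
>> }
>> 
>> Obviously this can break on any upgrade, so a proper builder hook would be the
>> better long-term fix.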
>> 
>> On Fri, Oct 15, 2021 at 4:03 AM Prasanna kumar <prasannakumarram...@gmail.com> wrote:
>> Yes, you are right.
>> 
>> We tested this recently and found that the Flink jobs do not pick up new topics
>> that are created after the job starts, even when they match the pattern provided
>> to the Flink Kafka consumer. The topic list is resolved only when the job starts.
>> 
>> Prasanna.
>> 
>> On Fri, 15 Oct 2021, 05:44 Preston Price, <nacro...@gmail.com> wrote:
>> Okay, so topic discovery is possible with topic patterns, and maybe with topic
>> lists. However, I don't believe it's possible to change the configured topic list
>> or topic pattern after the source is created.
>> 
>> On Thu, Oct 14, 2021, 3:52 PM Denis Nutiu <denis.nu...@gmail.com> wrote:
>> There is a setting for dynamic topic discovery:
>> https://nightlies.apache.org/flink/flink-docs-release-1.13/docs/connectors/table/kafka/#topic-and-partition-discovery
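>> 
>> For the DataStream KafkaSource, the corresponding knob is the
>> partition.discovery.interval.ms property mentioned earlier in this thread. A rough
>> sketch (broker address, group id, topic pattern and interval are placeholders):
>> 
>> import java.util.regex.Pattern;
>> 
>> import org.apache.flink.api.common.serialization.SimpleStringSchema;
>> import org.apache.flink.connector.kafka.source.KafkaSource;
>> 
>> KafkaSource<String> source = KafkaSource.<String>builder()
>>         .setBootstrapServers("broker:9092")                  // placeholder
>>         .setGroupId("my-consumer-group")                     // placeholder
>>         .setTopicPattern(Pattern.compile("events-.*"))       // subscribe by regex
>>         // re-evaluate the pattern and discover new partitions every 60s
>>         .setProperty("partition.discovery.interval.ms", "60000")
>>         .setValueOnlyDeserializer(new SimpleStringSchema())
>>         .build();
>> 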
>> Best,
>> 
>> Denis
>> 
>> 
>> On Fri, Oct 15, 2021 at 12:48 AM Denis Nutiu <denis.nu...@gmail.com> wrote:
>> Hi,
>> 
>> In my experience with the librdkafka client and the Go wrapper, the topic-pattern
>> subscription is reactive (new topics matching the pattern are picked up at runtime).
>> The Flink Kafka connector might behave similarly.
>> 
>> Best,
>> Denis
>> 
>> On Fri, Oct 15, 2021 at 12:34 AM Preston Price <nacro...@gmail.com> wrote:
>> No, the topic-pattern won't work for my case. Topics that I should subscribe 
>> to can be enabled/disabled based on settings I read from another system, so 
>> there's no way to craft a single regular expression that would fit the state 
>> of all potential topics. Additionally, the documentation you linked seems to
>> suggest that the regular expression is evaluated only once "when the job 
>> starts running". My understanding is it would not pick up new topics that 
>> match the pattern after the job starts.
>> 
>> 
>> On Wed, Oct 13, 2021 at 8:51 PM Caizhi Weng <tsreape...@gmail.com> wrote:
>> Hi!
>> 
>> I suppose you want to read from different topics every now and then? Does
>> the topic-pattern option [1] in the Table API Kafka connector meet your needs?
>> 
>> [1]
>> https://ci.apache.org/projects/flink/flink-docs-master/docs/connectors/table/kafka/#topic-pattern
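>> 
>> For example, a table along these lines would subscribe by pattern and periodically
>> discover newly created matching topics (a sketch; topic pattern, broker address,
>> schema, interval and format are placeholders):
>> 
>> import org.apache.flink.table.api.EnvironmentSettings;
>> import org.apache.flink.table.api.TableEnvironment;
>> 
>> TableEnvironment tableEnv =
>>         TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());
>> 
>> // Kafka table that subscribes to every topic matching the pattern and
>> // re-checks for new matching topics/partitions on the configured interval.
>> tableEnv.executeSql(
>>         "CREATE TABLE events (\n"
>>                 + "  payload STRING\n"
>>                 + ") WITH (\n"
>>                 + "  'connector' = 'kafka',\n"
>>                 + "  'topic-pattern' = 'events-.*',\n"
>>                 + "  'properties.bootstrap.servers' = 'broker:9092',\n"
>>                 + "  'properties.group.id' = 'my-consumer-group',\n"
>>                 + "  'scan.startup.mode' = 'earliest-offset',\n"
>>                 + "  'scan.topic-partition-discovery.interval' = '60 s',\n"
>>                 + "  'format' = 'json'\n"
>>                 + ")");
>> 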
>> On Thu, Oct 14, 2021 at 1:34 AM, Preston Price <nacro...@gmail.com> wrote:
>> The KafkaSource and KafkaSourceBuilder appear to prevent users from
>> providing their own KafkaSubscriber. Am I overlooking something?
>> 
>> In my case I have an external system that controls which topics we should be
>> ingesting, and that set can change over time. I need to add and remove topics as
>> we refresh configuration from this external system, without having to stop and
>> restart our Flink job. Initially it appeared I could accomplish this by providing
>> my own implementation of the `KafkaSubscriber` interface, which would be invoked
>> periodically as configured by the `partition.discovery.interval.ms` property.
>> However, there is no way to provide my implementation to the KafkaSource: the
>> constructor for KafkaSource is package-private, and the KafkaSourceBuilder does
>> not supply a way to provide the `KafkaSubscriber`.
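>> 
>> For illustration, the kind of subscriber I have in mind would look roughly like this
>> (a sketch only: it builds against the internal KafkaSubscriber interface, so the
>> method signature is my reading of the current code, and fetchEnabledTopics() is a
>> placeholder for the call into the external system):
>> 
>> import java.util.Arrays;
>> import java.util.HashSet;
>> import java.util.Set;
>> 
>> import org.apache.flink.connector.kafka.source.enumerator.subscriber.KafkaSubscriber;
>> import org.apache.kafka.clients.admin.AdminClient;
>> import org.apache.kafka.clients.admin.TopicDescription;
>> import org.apache.kafka.common.TopicPartition;
>> 
>> /** Sketch: re-resolves the topic set from an external system on every discovery cycle. */
>> public class ExternallyConfiguredSubscriber implements KafkaSubscriber {
>> 
>>     @Override
>>     public Set<TopicPartition> getSubscribedTopicPartitions(AdminClient adminClient) {
>>         Set<TopicPartition> result = new HashSet<>();
>>         try {
>>             Set<String> topics = fetchEnabledTopics();
>>             for (TopicDescription topic :
>>                     adminClient.describeTopics(topics).all().get().values()) {
>>                 topic.partitions().forEach(
>>                         p -> result.add(new TopicPartition(topic.name(), p.partition())));
>>             }
>>         } catch (Exception e) {
>>             throw new RuntimeException("Failed to resolve subscribed partitions", e);
>>         }
>>         return result;
>>     }
>> 
>>     private Set<String> fetchEnabledTopics() {
>>         // Placeholder: read the currently enabled topics from the external system.
>>         return new HashSet<>(Arrays.asList("topic-a", "topic-b"));
>>     }
>> }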
>> 
>> How can I accomplish a periodic refresh of the topics to ingest?
>> 
>> Thanks
>> 
>> 
>> 
>> 
>> -- 
>> Regards,
>> Denis Nutiu
>> 
>> 
>> -- 
>> Regards,
>> Denis Nutiu
> 
