Re: What is the best way to have a cache of an external database in Flink?

2021-01-22 Thread Selvaraj chennappan
Hi, Perhaps broadcast state is a natural fit for this scenario. https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/stream/state/broadcast_state.html Thanks, Selvaraj C On Fri, 22 Jan 2021 at 8:45 PM, Kumar Bolar, Harshith wrote: > Hi all, > > The external database consists of a
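The broadcast-state pattern keeps a copy of slowly changing reference data on every parallel task: a control stream of database updates is broadcast, each task stores the updates in broadcast `MapState`, and the main stream does local lookups against it. A minimal Flink-free model of that flow (class and field names here are illustrative, not from the thread; in a real job `updateReference` would be `processBroadcastElement` and `enrich` would be `processElement` of a `BroadcastProcessFunction`):

```java
import java.util.HashMap;
import java.util.Map;

// Flink-free sketch of the broadcast-state cache pattern.
public class BroadcastCacheModel {
    // Stands in for Flink's broadcast MapState<String, String>.
    private final Map<String, String> referenceData = new HashMap<>();

    // Control-stream side: an update from the external database arrives.
    public void updateReference(String key, String value) {
        referenceData.put(key, value);
    }

    // Data-stream side: enrich an event with the cached value, if any.
    public String enrich(String eventKey) {
        return referenceData.getOrDefault(eventKey, "UNKNOWN");
    }

    public static void main(String[] args) {
        BroadcastCacheModel cache = new BroadcastCacheModel();
        cache.updateReference("cust-1", "GOLD");
        System.out.println(cache.enrich("cust-1")); // GOLD
        System.out.println(cache.enrich("cust-2")); // UNKNOWN
    }
}
```

Because the state is replicated to every task, this fits small-to-medium reference tables; a very large external database would call for async I/O lookups instead.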

update the existing Keyed value state

2019-05-03 Thread Selvaraj chennappan
Hi Users, We want to have real-time aggregation (KPIs). We are maintaining aggregation counters in keyed value state. The key could be customer activation date and type. Lots of counters are maintained against that key. If we want to add one more counter for the existing keys which is in the
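The counter layout described above can be modelled outside Flink as a map of named counters per key; in the actual job that inner map would live in a `ValueState<Map<String, Long>>` (or a `MapState`) scoped to the key. A sketch, with hypothetical key and counter names:

```java
import java.util.HashMap;
import java.util.Map;

// Flink-free model of per-key KPI counters held in keyed value state.
public class KeyedCounters {
    // key = activationDate + "|" + type; value = named counters for that key
    private final Map<String, Map<String, Long>> state = new HashMap<>();

    public void increment(String key, String counterName, long delta) {
        state.computeIfAbsent(key, k -> new HashMap<>())
             .merge(counterName, delta, Long::sum);
    }

    public long get(String key, String counterName) {
        return state.getOrDefault(key, Map.of()).getOrDefault(counterName, 0L);
    }

    public static void main(String[] args) {
        KeyedCounters c = new KeyedCounters();
        c.increment("2019-05-01|PREPAID", "activations", 1);
        c.increment("2019-05-01|PREPAID", "activations", 1);
        // A brand-new counter against an existing key needs no schema change:
        c.increment("2019-05-01|PREPAID", "churns", 1);
        System.out.println(c.get("2019-05-01|PREPAID", "activations")); // 2
    }
}
```

Storing the counters as a map means new counters can be added to existing keys without migrating state, which is the question the thread raises.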

Re: Flink Standalone cluster - logging problem

2019-02-11 Thread Selvaraj chennappan
Could you please try modifying conf/logback.xml. Regards, Selvaraj C On Mon, Feb 11, 2019 at 4:32 PM simpleusr wrote: > Hi Gary, > > By "job logs" I mean all the loggers under a subpackage of > com.mycompany.xyz > . > > We are using ./bin/flink run command for job execution, that's why I modified
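For the `com.mycompany.xyz` loggers mentioned in the thread, the change to `conf/logback.xml` would look something like this (a sketch only; the appender name `file` matches the appender shipped in Flink's default logback.xml, but verify it against your own config):

```xml
<!-- Route the job's own loggers to Flink's file appender. -->
<logger name="com.mycompany.xyz" level="INFO" additivity="false">
    <appender-ref ref="file"/>
</logger>
```

Since the job is submitted with `./bin/flink run`, the logback config of the standalone cluster's JobManager/TaskManager processes is what applies, not any logback.xml bundled in the job jar.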

Re: getting duplicate messages from duplicate jobs

2019-01-30 Thread Selvaraj chennappan
I have faced the same problem. https://stackoverflow.com/questions/54286486/two-kafka-consumer-in-same-group-and-one-partition On Wed, Jan 30, 2019 at 6:11 PM Avi Levi wrote: > Ok, if you guys think it should be like that then so be it. All I am > saying is that it is not standard behaviour

Re: Forking a stream with Flink

2019-01-29 Thread Selvaraj chennappan
com> wrote: > Hi Selvaraj > > In your pojo add a data member such as status, now set it to > error in case it is invalid. Pass the output of flatmap > to the split operator, there you can split the stream > > On Tue, Jan 29, 2019 at 6:39 PM Selvaraj chennappan <

Re: connecting two streams flink

2019-01-29 Thread Selvaraj chennappan
fast and the other (c1) > > > > > > On Tue, Jan 29, 2019 at 2:44 PM Selvaraj chennappan < > selvarajchennap...@gmail.com> wrote: > >> Team, >> >> I have two Kafka consumers for the same topic and want to join the second stream >> to the first after a couple of subtasks' computatio

Re: Forking a stream with Flink

2019-01-29 Thread Selvaraj chennappan
UseCase:- We have a Kafka consumer to read messages (JSON); it then applies a flatmap for transformation based on the rules (the rules are complex) and converts each message to a pojo. We want to verify the record (pojo) is valid by checking field by field of that record. If a record is invalid due to
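The reply above suggests tagging each pojo with a status and splitting the stream on it. In current Flink the idiomatic tool for this is a side output (`OutputTag` in a `ProcessFunction`), since `split()` was deprecated. A Flink-free model of the routing logic (the `Record` type and the `"OK"` status rule are hypothetical stand-ins):

```java
import java.util.ArrayList;
import java.util.List;

// Flink-free model of routing validated records, mirroring a
// ProcessFunction whose main output carries valid records and whose
// side output (OutputTag) carries invalid ones.
public class ValidationRouter {
    public record Record(String id, String status) {}

    public final List<Record> valid = new ArrayList<>();   // main output
    public final List<Record> invalid = new ArrayList<>(); // side output

    public void process(Record r) {
        if ("OK".equals(r.status())) {
            valid.add(r);      // out.collect(r) in Flink
        } else {
            invalid.add(r);    // ctx.output(errorTag, r) in Flink
        }
    }

    public static void main(String[] args) {
        ValidationRouter router = new ValidationRouter();
        router.process(new Record("1", "OK"));
        router.process(new Record("2", "ERROR"));
        System.out.println(router.valid.size() + " valid, "
                + router.invalid.size() + " invalid");
    }
}
```

Routing inside one operator avoids running the expensive rule-based flatmap twice, which is the concern behind forking the stream.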

connecting two streams flink

2019-01-29 Thread Selvaraj chennappan
Team, I have two Kafka consumers for the same topic and want to join the second stream to the first after a couple of subtasks' computation in the first stream, then validate the record. KT - C1, C2 KT - C1 - Transformation(FlatMap) - Dedup - Validate -- if valid save it to DB - C2 - Process
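In Flink this kind of join is usually done by keying both streams, calling `stream1.connect(stream2)`, and buffering whichever side arrives first in keyed state inside a `CoProcessFunction` until the matching element from the other side shows up. A Flink-free model of that buffering logic (names `onC1`/`onC2` are hypothetical stand-ins for the two `processElement` callbacks):

```java
import java.util.HashMap;
import java.util.Map;

// Flink-free model of connecting two keyed streams: each side is
// buffered (as keyed state would be) until the other side's element
// for the same key arrives, then the pair is emitted.
public class TwoStreamJoinModel {
    private final Map<String, String> fromC1 = new HashMap<>(); // transformed side
    private final Map<String, String> fromC2 = new HashMap<>(); // raw side
    public final Map<String, String> joined = new HashMap<>();

    public void onC1(String key, String value) {
        fromC1.put(key, value);
        tryJoin(key);
    }

    public void onC2(String key, String value) {
        fromC2.put(key, value);
        tryJoin(key);
    }

    private void tryJoin(String key) {
        if (fromC1.containsKey(key) && fromC2.containsKey(key)) {
            joined.put(key, fromC1.get(key) + "+" + fromC2.get(key));
        }
    }

    public static void main(String[] args) {
        TwoStreamJoinModel j = new TwoStreamJoinModel();
        j.onC2("k1", "raw");     // C2 arrives first and is buffered
        j.onC1("k1", "deduped"); // C1's deduped result completes the join
        System.out.println(j.joined.get("k1")); // deduped+raw
    }
}
```

In a real job the buffered state should also be cleared (e.g. with a timer) once joined, or it grows without bound.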