Hi Flink Experts,

How can I achieve at-least-once semantics with FlinkDynamoDBStreamsConsumer +
DynamoDB Streams? Do Flink checkpoints or savepoints do the job?

My scenario:
My Flink application uses FlinkDynamoDBStreamsConsumer to read the latest
changes from a DynamoDB stream. If the software fails and the application
is down for 30 minutes for some reason, can the Flink application read the
changes it missed from the DynamoDB stream during those 30 minutes once it
restarts?
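For context, the setup I have in mind looks roughly like the sketch below: checkpointing is enabled so that the consumer's shard positions are snapshotted, and (as I understand it) on restart from a checkpoint the consumer should resume from the last checkpointed position. The region, checkpoint interval, and stream ARN here are placeholders, and the property names are my assumption based on the Kinesis connector that FlinkDynamoDBStreamsConsumer is part of. Note also that DynamoDB Streams retains records for 24 hours, so a 30-minute outage should still be within retention.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkDynamoDBStreamsConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;

public class DdbStreamsJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Checkpoint every 60 s; the consumer's shard offsets are included in
        // the checkpoint, so a restart from the checkpoint resumes reading
        // from the last checkpointed position rather than from LATEST.
        env.enableCheckpointing(60_000);

        Properties props = new Properties();
        props.setProperty(AWSConfigConstants.AWS_REGION, "us-east-1"); // placeholder
        props.setProperty(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "LATEST");

        env.addSource(new FlinkDynamoDBStreamsConsumer<>(
                        "arn:aws:dynamodb:...",   // DynamoDB stream ARN (placeholder)
                        new SimpleStringSchema(),
                        props))
           .print();

        env.execute("ddb-streams-at-least-once");
    }
}
```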

Or would it be better to write the DynamoDB stream changes to a Kafka or
Kinesis topic and have the Flink application read from that topic instead,
which would let the Flink application resume from the last successfully
processed record using, for example, Kafka offsets?
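The alternative I'm considering would look something like this: the change records land in a Kafka topic, and the Flink job consumes them with the standard Kafka connector, so that checkpointed offsets define where to resume after a failure. The broker address, topic name, and group id below are assumptions for illustration.

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class DdbViaKafkaJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // With checkpointing enabled, the Kafka consumer's offsets are stored
        // in Flink checkpoints; after a 30-minute outage the job restarts
        // from the last checkpointed offset and replays the missed records.
        env.enableCheckpointing(60_000);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker:9092"); // assumption
        props.setProperty("group.id", "flink-ddb-consumer");   // assumption

        env.addSource(new FlinkKafkaConsumer<>(
                        "ddb-changes",             // topic name (assumption)
                        new SimpleStringSchema(),
                        props))
           .print();

        env.execute("ddb-via-kafka");
    }
}
```

The appeal of this route is that Kafka's retention can be configured much longer than the 24 hours DynamoDB Streams provides.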

What would be the best architecture to move forward?

-- 
Thanks & Regards
Sri Tummala
