Hi guys, good morning. I have been running some tests with Apache Beam on Dataflow to see whether I can do a hot update (hot swap) while the pipeline is processing a batch of messages that fall inside a 10-minute time window. What I observed is that when I do a hot update while there are still messages sitting in the time window (i.e. before they are emitted to the target), the current job is shut down and Dataflow creates a new one. The problem is that the messages that were being processed by the old job appear to be lost and are not picked up by the new one, which means we are losing data.
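To make the setup concrete, here is a toy model of how the 10-minute fixed windows bucket event timestamps (plain Python that mimics Beam's FixedWindows, not actual Beam/Dataflow code; the timestamps are made-up examples). The elements at risk are the ones sitting in a window that has not yet fired when the update happens:

```python
# Toy model of 10-minute fixed windowing (illustrative only,
# mimicking the idea behind Beam's FixedWindows).
WINDOW_SECONDS = 600  # the 10-minute window from the pipeline above

def window_for(event_ts: int) -> tuple:
    """Return the [start, end) window an event timestamp falls into."""
    start = (event_ts // WINDOW_SECONDS) * WINDOW_SECONDS
    return (start, start + WINDOW_SECONDS)

# Events at 12:03:00 and 12:07:00 land in the same window,
# while 12:11:00 starts the next one.
print(window_for(43380))  # 12:03:00 -> (43200, 43800)
print(window_for(43620))  # 12:07:00 -> (43200, 43800)
print(window_for(43860))  # 12:11:00 -> (43800, 44400)
```

Any message whose window is still open (not yet flushed to the target) at the moment of the update is the kind that seems to disappear.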
Can you help me or recommend a strategy for this? Thanks!!
