zhoufek commented on a change in pull request #15603:
URL: https://github.com/apache/beam/pull/15603#discussion_r718838778
##########
File path: sdks/python/apache_beam/transforms/trigger.py
##########
@@ -161,12 +159,42 @@ def with_prefix(self, prefix):
class DataLossReason(Flag):
- """Enum defining potential reasons that a trigger may cause data loss."""
+ """Enum defining potential reasons that a trigger may cause data loss.
+
+ These flags should only cover cases where the trigger itself is the cause,
+ though windowing can be taken into account. For instance, AfterWatermark may
+ not flag itself as finishing if the windowing doesn't allow lateness.
+ """
+
+ # Trigger will never be the source of data loss.
NO_POTENTIAL_LOSS = 0
+
+ # Trigger may finish. In this case, data that arrives after the trigger
+ # finishes may be lost. Example: AfterCount(1) will stop firing after the
+ # first element.
MAY_FINISH = auto()
+
+ # Trigger has a condition that is not guaranteed to ever be met. In this
+ # case, data that comes in may be lost. Example: AfterCount(42) will lose
+ # 20 records if only 20 come in, since the condition to fire was never met.
CONDITION_NOT_GUARANTEED = auto()
Review comment:
Ok, so what _should_ happen in the test is that the four elements in the global
window are emitted at GC time and processed by GBK before the pipeline
finishes, but a bug in the direct runner prevents those four elements from
being emitted. Is that right?

If so, I'm guessing I should remove this as a potential data loss reason. It
was only included because of the comment in the documentation and because the
test seemed to confirm it.
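
For reference, here is a minimal sketch of the kind of pipeline I have in mind
(the element count and trigger threshold are my own illustration, not the
actual test): fewer elements arrive than AfterCount requires, so the only way
they can come out of GBK is the GC-time firing discussed above.

```python
import apache_beam as beam
from apache_beam.transforms import trigger, window

# Illustrative only: 20 elements under an AfterCount(42) trigger, so the
# trigger's firing condition is never met. Whether these elements still come
# out of GroupByKey at GC time (when the global window closes at pipeline
# completion) is exactly the question above.
with beam.Pipeline() as p:
  _ = (
      p
      | beam.Create([('k', i) for i in range(20)])
      | beam.WindowInto(
          window.GlobalWindows(),
          trigger=trigger.AfterCount(42),
          accumulation_mode=trigger.AccumulationMode.DISCARDING)
      | beam.GroupByKey()
      | beam.Map(print))
```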