kennknowles commented on PR #16773:
URL: https://github.com/apache/beam/pull/16773#issuecomment-1092275275

   I looked into this, and I think the right fix is to have a separate 
`ReadBoundedFromKafkaDoFn` that indicates statically that it is bounded per 
element. Then the proper behavior of "multiplying" the input cardinality by the 
DoFn's per-element cardinality can work as intended.
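
   The "multiplying" logic above can be sketched as follows. This is a hedged illustration, not Beam's actual implementation: the class and method names are made up, but the rule matches the described behavior — a ParDo's output is bounded only when the input PCollection is bounded *and* the DoFn (e.g. the proposed `ReadBoundedFromKafkaDoFn`) statically declares that it is bounded per element:

   ```java
   // Hypothetical sketch of deriving output boundedness (names assumed,
   // not Beam's real API).
   public class Boundedness {
     enum IsBounded { BOUNDED, UNBOUNDED }

     // Output is bounded only when the input is bounded AND the DoFn emits
     // a bounded number of elements per input element.
     static IsBounded outputBoundedness(IsBounded input, IsBounded perElement) {
       return (input == IsBounded.BOUNDED && perElement == IsBounded.BOUNDED)
           ? IsBounded.BOUNDED
           : IsBounded.UNBOUNDED;
     }

     public static void main(String[] args) {
       // A bounded-per-element read DoFn on a bounded input: bounded output.
       System.out.println(outputBoundedness(IsBounded.BOUNDED, IsBounded.BOUNDED));
       // An unbounded-per-element read DoFn makes the output unbounded,
       // regardless of the input's boundedness.
       System.out.println(outputBoundedness(IsBounded.BOUNDED, IsBounded.UNBOUNDED));
     }
   }
   ```

   With a separate DoFn that is statically bounded per element, this derivation gives the correct answer without needing the experiment flag.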
   
   I am less familiar with Kafka's details, but would it also be possible that 
the event timestamp / watermark on the topic does not advance? In that case we 
would want the above change (which is safe) along with this experiment, which 
should be renamed to something like `unsafely_treat_pcollections_as_bounded`.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@beam.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
