[ 
https://issues.apache.org/jira/browse/BEAM-14408?focusedWorklogId=777196&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-777196
 ]

ASF GitHub Bot logged work on BEAM-14408:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 01/Jun/22 22:04
            Start Date: 01/Jun/22 22:04
    Worklog Time Spent: 10m 
      Work Description: TheNeuralBit commented on code in PR #17771:
URL: https://github.com/apache/beam/pull/17771#discussion_r887338327


##########
sdks/python/apache_beam/runners/worker/opcounters.py:
##########
@@ -219,9 +219,14 @@ def update_from_batch(self, windowed_batch):
     assert self.producer_batch_converter is not None
     assert isinstance(windowed_batch, windowed_value.HomogeneousWindowedBatch)
 
-    self.element_counter.update(
-        self.producer_batch_converter.get_length(windowed_batch.values))
-    # TODO(BEAM-14408): Update byte size estimate
+    batch_length = self.producer_batch_converter.get_length(
+        windowed_batch.values)
+    self.element_counter.update(batch_length)
+
+    mean_element_size = self.producer_batch_converter.estimate_byte_size(
+        windowed_batch.values) / batch_length
+    for _ in range(batch_length):
+      self.mean_byte_counter.update(mean_element_size)

Review Comment:
   Added `Counter.update_n` to address this.
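The diff above spreads one per-batch byte-size estimate evenly across the batch's elements by looping `batch_length` times; the review comment notes this loop was replaced with a single `Counter.update_n` call. A minimal sketch of that counting strategy (hypothetical `MeanCounter` class, not Beam's actual counter implementation):

```python
# Sketch of the strategy in the diff: a running-mean counter where
# update_n absorbs n identical samples in O(1), replacing the
# per-element loop shown above. Names here are illustrative only.

class MeanCounter:
  """Tracks a running mean over observed values."""

  def __init__(self):
    self.total = 0.0
    self.count = 0

  def update(self, value):
    self.total += value
    self.count += 1

  def update_n(self, value, n):
    # Equivalent to calling update(value) n times, without the loop.
    self.total += value * n
    self.count += n

  def mean(self):
    return self.total / self.count if self.count else 0.0


# Usage mirroring update_from_batch: one byte-size estimate for the
# whole batch is divided evenly among its elements.
batch_length = 4
batch_byte_estimate = 1024  # hypothetical estimate for the batch
counter = MeanCounter()
counter.update_n(batch_byte_estimate / batch_length, batch_length)
```

Spreading the batch estimate this way keeps the element count and the mean element size consistent, while `update_n` avoids O(batch_length) counter updates per batch.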





Issue Time Tracking
-------------------

    Worklog Id:     (was: 777196)
    Time Spent: 1h 10m  (was: 1h)

> batch-consuming DoFns should estimate byte size
> -----------------------------------------------
>
>                 Key: BEAM-14408
>                 URL: https://issues.apache.org/jira/browse/BEAM-14408
>             Project: Beam
>          Issue Type: Sub-task
>          Components: sdk-py-core
>            Reporter: Brian Hulette
>            Assignee: Brian Hulette
>            Priority: P2
>          Time Spent: 1h 10m
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian Jira
(v8.20.7#820007)
