[ https://issues.apache.org/jira/browse/BEAM-12865?focusedWorklogId=700146&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-700146 ]

ASF GitHub Bot logged work on BEAM-12865:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 22/Dec/21 17:33
            Start Date: 22/Dec/21 17:33
    Worklog Time Spent: 10m 
      Work Description: emilymye commented on pull request #16315:
URL: https://github.com/apache/beam/pull/16315#issuecomment-999749588


   Run Python 3.7 PostCommit



Issue Time Tracking
-------------------

    Worklog Id:     (was: 700146)
    Time Spent: 16h  (was: 15h 50m)

> Allow customising batch duration when streaming with WriteToBigQuery
> --------------------------------------------------------------------
>
>                 Key: BEAM-12865
>                 URL: https://issues.apache.org/jira/browse/BEAM-12865
>             Project: Beam
>          Issue Type: New Feature
>          Components: io-py-gcp
>    Affects Versions: Not applicable
>            Reporter: Quentin Sommer
>            Priority: P2
>             Fix For: Not applicable
>
>          Time Spent: 16h
>  Remaining Estimate: 0h
>
> Hi,
> We allow customising the {{batch_size}} when streaming to BigQuery, but the 
> batch duration (used by {{GroupIntoBatches}}) is fixed at 
> {{DEFAULT_BATCH_BUFFERING_DURATION_LIMIT_SEC}} (0.2 seconds).
> I'd like to add an option to specify the batch duration, to allow better 
> batching in scenarios with low data throughput.
> It will reuse the {{triggering_frequency}} param already used when doing 
> batch file loads.
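
For illustration, here is a minimal sketch of how the option could be used from a streaming pipeline, assuming the change reuses the existing {{triggering_frequency}} parameter of {{WriteToBigQuery}} as the buffering duration for {{GroupIntoBatches}} on the streaming-inserts path. The Pub/Sub topic, table, and schema below are placeholders, and {{with_auto_sharding}} is shown only on the assumption that the batching path applies when auto-sharding is enabled:

{code:python}
# Sketch only: assumes triggering_frequency controls the GroupIntoBatches
# buffering duration for streaming inserts, per the proposal above.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        # Hypothetical source topic; any unbounded source would do.
        | "Read" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/my-topic")
        | "Parse" >> beam.Map(lambda msg: {"payload": msg.decode("utf-8")})
        | "Write" >> beam.io.WriteToBigQuery(
            table="my-project:my_dataset.my_table",
            schema="payload:STRING",
            method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS,
            batch_size=500,
            # Buffer rows for up to 5 seconds instead of the default
            # 0.2 s (DEFAULT_BATCH_BUFFERING_DURATION_LIMIT_SEC), so
            # low-throughput pipelines can form fuller batches.
            triggering_frequency=5,
            with_auto_sharding=True,
        )
    )
{code}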


