[
https://issues.apache.org/jira/browse/BEAM-12865?focusedWorklogId=702196&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-702196
]
ASF GitHub Bot logged work on BEAM-12865:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 30/Dec/21 04:17
Start Date: 30/Dec/21 04:17
Worklog Time Spent: 10m
Work Description: quentin-sommer commented on pull request #15489:
URL: https://github.com/apache/beam/pull/15489#issuecomment-1002865699
No problem 👍 Yes we can close the ticket!
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 702196)
Time Spent: 17.5h (was: 17h 20m)
> Allow customising batch duration when streaming with WriteToBigQuery
> --------------------------------------------------------------------
>
> Key: BEAM-12865
> URL: https://issues.apache.org/jira/browse/BEAM-12865
> Project: Beam
> Issue Type: New Feature
> Components: io-py-gcp
> Affects Versions: Not applicable
> Reporter: Quentin Sommer
> Priority: P2
> Fix For: Not applicable
>
> Time Spent: 17.5h
> Remaining Estimate: 0h
>
> Hi,
> We allow customising the {{batch_size}} when streaming to BigQuery, but the
> batch duration (used by {{GroupIntoBatches}}) is fixed at
> {{DEFAULT_BATCH_BUFFERING_DURATION_LIMIT_SEC}} (0.2 seconds).
> I'd like to add an option to specify the batch duration, to allow better
> batching in scenarios with low data throughput.
> It will reuse the {{triggering_frequency}} parameter already used when doing
> batch file loads.
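> For illustration, a minimal pipeline sketch of how the proposed option would be
> used (project, topic, table and schema names are hypothetical; the exact
> requirements for {{triggering_frequency}} with {{STREAMING_INSERTS}}, such as
> {{with_auto_sharding}}, depend on the Beam version):
> {code:python}
> import json
>
> import apache_beam as beam
> from apache_beam.options.pipeline_options import PipelineOptions
>
> options = PipelineOptions(streaming=True)
>
> with beam.Pipeline(options=options) as p:
>     (
>         p
>         | "ReadEvents" >> beam.io.ReadFromPubSub(
>             topic="projects/my-project/topics/events")   # hypothetical topic
>         | "ParseJson" >> beam.Map(json.loads)
>         | "WriteToBQ" >> beam.io.WriteToBigQuery(
>             table="my-project:my_dataset.events",        # hypothetical table
>             schema="event_id:STRING,payload:STRING",     # hypothetical schema
>             method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS,
>             batch_size=500,           # max rows per streaming-insert request
>             triggering_frequency=5,   # flush a pending batch at least every 5 s
>             with_auto_sharding=True,  # may be required for this combination
>         )
>     )
> {code}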
--
This message was sent by Atlassian Jira
(v8.20.1#820001)