[
https://issues.apache.org/jira/browse/BEAM-12865?focusedWorklogId=700351&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-700351
]
ASF GitHub Bot logged work on BEAM-12865:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 23/Dec/21 00:15
Start Date: 23/Dec/21 00:15
Worklog Time Spent: 10m
Work Description: emilymye commented on pull request #16315:
URL: https://github.com/apache/beam/pull/16315#issuecomment-999954073
I think we're safe to ignore the postcommit error for now. Pablo didn't
seem to have additional comments, so I'm good with reviewing and merging for now.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 700351)
Time Spent: 16h 50m (was: 16h 40m)
> Allow customising batch duration when streaming with WriteToBigQuery
> --------------------------------------------------------------------
>
> Key: BEAM-12865
> URL: https://issues.apache.org/jira/browse/BEAM-12865
> Project: Beam
> Issue Type: New Feature
> Components: io-py-gcp
> Affects Versions: Not applicable
> Reporter: Quentin Sommer
> Priority: P2
> Fix For: Not applicable
>
> Time Spent: 16h 50m
> Remaining Estimate: 0h
>
> Hi,
> We allow customising the {{batch_size}} when streaming to BigQuery, but the
> batch duration (used by {{GroupIntoBatches}}) is fixed at
> {{DEFAULT_BATCH_BUFFERING_DURATION_LIMIT_SEC}} (0.2 seconds).
> I'd like to add an option to specify the batch duration, to allow better
> batching in scenarios with low data throughput.
> It will reuse the {{triggering_frequency}} param already used when doing
> batch file loads.
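The batching semantics being discussed can be illustrated with a small pure-Python sketch (not actual Beam code; the `Batcher` class and its parameter names are illustrative): a batch is flushed either when `batch_size` elements have accumulated or when the buffering duration since the first buffered element exceeds a configurable limit, which defaults to the 0.2-second constant mentioned above.

```python
import time
from typing import Any, Callable, List, Optional

# The fixed default the issue proposes making configurable.
DEFAULT_BATCH_BUFFERING_DURATION_LIMIT_SEC = 0.2


class Batcher:
    """Buffers elements and flushes a batch when either the size limit
    or the buffering-duration limit is reached (a high-level sketch of
    GroupIntoBatches-style behaviour, not the Beam implementation)."""

    def __init__(
        self,
        batch_size: int,
        max_buffering_duration_sec: float = DEFAULT_BATCH_BUFFERING_DURATION_LIMIT_SEC,
        clock: Callable[[], float] = time.monotonic,
    ):
        self.batch_size = batch_size
        self.max_buffering_duration_sec = max_buffering_duration_sec
        self.clock = clock  # injectable for testing
        self._buffer: List[Any] = []
        self._first_element_time: Optional[float] = None

    def add(self, element: Any) -> List[Any]:
        """Buffer one element; return the flushed batch, or [] if still buffering."""
        if not self._buffer:
            self._first_element_time = self.clock()
        self._buffer.append(element)
        elapsed = self.clock() - self._first_element_time
        if len(self._buffer) >= self.batch_size or elapsed >= self.max_buffering_duration_sec:
            return self.flush()
        return []

    def flush(self) -> List[Any]:
        """Emit whatever is buffered, e.g. on bundle finish."""
        batch, self._buffer = self._buffer, []
        self._first_element_time = None
        return batch
```

With a low-throughput stream, a larger `max_buffering_duration_sec` lets the buffer fill closer to `batch_size` before flushing, which is exactly the knob this issue asks to expose (via the existing {{triggering_frequency}} parameter).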
--
This message was sent by Atlassian Jira
(v8.20.1#820001)