[ https://issues.apache.org/jira/browse/BEAM-11521?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17288498#comment-17288498 ]
Beam JIRA Bot commented on BEAM-11521:
--------------------------------------
This issue is P2 but has been unassigned without any comment for 60 days, so it
has been labeled "stale-P2". If this issue is still affecting you, we care!
Please comment and remove the label. Otherwise, in 14 days the issue will be
moved to P3.
Please see https://beam.apache.org/contribute/jira-priorities/ for a detailed
explanation of what these priorities mean.
> BigQuery: add maxStreamingBatchSize for Python streaming inserts
> ----------------------------------------------------------------
>
> Key: BEAM-11521
> URL: https://issues.apache.org/jira/browse/BEAM-11521
> Project: Beam
> Issue Type: Bug
> Components: io-py-gcp
> Reporter: Udi Meiri
> Priority: P2
> Labels: stale-P2
>
> The Java SDK already has a maxStreamingBatchSize option.
> Implementing something similar for Python would prevent HTTP 400 errors from
> BigQuery when streaming-insert batches are too large. Instead of failing,
> Beam should split oversized batches, and log a warning (once) when a single
> row is itself too large to fit within the limit; a sketch of this splitting
> logic follows.
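>
> As a minimal sketch, the splitting could look like the following (the
> function name split_into_batches, the max_batch_size_bytes parameter, and
> the JSON-based row size estimate are illustrative assumptions, not existing
> Beam API):
>
>     import json
>     import logging
>
>     _LOGGER = logging.getLogger(__name__)
>
>     def split_into_batches(rows, max_batch_size_bytes):
>         """Yields lists of rows whose combined size stays under the limit.
>
>         A row larger than the limit is emitted as a batch of one, with a
>         one-time warning, since it cannot be split further.
>         """
>         warned = False
>         batch, batch_size = [], 0
>         for row in rows:
>             # Estimate the row's wire size via its JSON encoding.
>             row_size = len(json.dumps(row).encode('utf-8'))
>             if row_size > max_batch_size_bytes and not warned:
>                 _LOGGER.warning(
>                     'Row of %d bytes exceeds the %d-byte batch limit; '
>                     'inserting it as a batch of one.',
>                     row_size, max_batch_size_bytes)
>                 warned = True
>             # Flush the current batch before it would exceed the limit.
>             if batch and batch_size + row_size > max_batch_size_bytes:
>                 yield batch
>                 batch, batch_size = [], 0
>             batch.append(row)
>             batch_size += row_size
>         if batch:
>             yield batch
>
> Each yielded batch could then be passed to a single streaming-insert request
> (e.g. one tabledata.insertAll call), mirroring what maxStreamingBatchSize
> does on the Java side.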
--
This message was sent by Atlassian Jira (v8.3.4#803005)