[
https://issues.apache.org/jira/browse/BEAM-8841?focusedWorklogId=394549&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-394549
]
ASF GitHub Bot logged work on BEAM-8841:
----------------------------------------
Author: ASF GitHub Bot
Created on: 28/Feb/20 00:08
Start Date: 28/Feb/20 00:08
Worklog Time Spent: 10m
Work Description: chunyang commented on pull request #10979: [BEAM-8841]
Support writing data to BigQuery via Avro in Python SDK
URL: https://github.com/apache/beam/pull/10979#discussion_r385442046
##########
File path: sdks/python/apache_beam/io/gcp/bigquery.py
##########
@@ -1361,87 +1369,18 @@ def __init__(
self.triggering_frequency = triggering_frequency
self.insert_retry_strategy = insert_retry_strategy
self._validate = validate
+ self._temp_file_format = temp_file_format or bigquery_tools.FileFormat.JSON
Review comment:
AFAICT using Avro has no disadvantages compared to JSON for loading data
into BigQuery, but would requiring a schema constitute a breaking API change
for semantic versioning purposes?
Personally I'm for using Avro as the default. I'd expect that when users update Beam,
they can specify `temp_file_format` explicitly to get the old behavior (sketched below).
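For readers following along, here is a minimal sketch of how a pipeline author might pin the temp file format explicitly. The `temp_file_format` parameter and the `bigquery_tools.FileFormat` constants are the ones introduced in this PR; the project, table, schema, and input data are hypothetical placeholders.

```python
import apache_beam as beam
from apache_beam.io.gcp import bigquery_tools

# A minimal sketch, assuming the temp_file_format parameter added in this PR.
with beam.Pipeline() as pipeline:
    (pipeline
     | beam.Create([{'user': 'alice', 'score': 0.75}])
     | beam.io.WriteToBigQuery(
         'my-project:my_dataset.my_table',      # hypothetical destination table
         schema='user:STRING, score:FLOAT',     # Avro file loads need an explicit schema
         method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
         # Keep the pre-existing JSON behavior explicitly; pass
         # bigquery_tools.FileFormat.AVRO to opt into Avro temp files.
         temp_file_format=bigquery_tools.FileFormat.JSON))
```

Whichever default the SDK settles on, passing the argument explicitly sidesteps the semantic-versioning concern for existing pipelines.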
----------------------------------------------------------------
Issue Time Tracking
-------------------
Worklog Id: (was: 394549)
Time Spent: 3h (was: 2h 50m)
> Add ability to perform BigQuery file loads using avro
> -----------------------------------------------------
>
> Key: BEAM-8841
> URL: https://issues.apache.org/jira/browse/BEAM-8841
> Project: Beam
> Issue Type: Improvement
> Components: io-py-gcp
> Reporter: Chun Yang
> Assignee: Chun Yang
> Priority: Minor
> Time Spent: 3h
> Remaining Estimate: 0h
>
> Currently, JSON format is used for file loads into BigQuery in the Python
> SDK. JSON has some disadvantages, including the size of the serialized data
> and the inability to represent NaN and infinity float values (illustrated in
> the sketch below).
> BigQuery supports loading files in Avro format, which can overcome these
> disadvantages. The Java SDK already supports loading files using Avro format
> (BEAM-2879), so it makes sense to support it in the Python SDK as well.
> The change will be somewhere around
> [{{BigQueryBatchFileLoads}}|https://github.com/apache/beam/blob/3e7865ee6c6a56e51199515ec5b4b16de1ddd166/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py#L554].
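As a standalone illustration of the NaN/infinity limitation mentioned in the description (plain Python, no Beam involved), this sketch shows that JSON has no representation for these float values, which is one motivation for the Avro-based load path:

```python
import json

row = {'score': float('nan'), 'ratio': float('inf')}

# By default, Python's json module emits the non-standard tokens NaN and
# Infinity, which are not valid JSON and are rejected by BigQuery's JSON loader.
print(json.dumps(row))   # {"score": NaN, "ratio": Infinity}

# With strict compliance enforced, serialization fails outright.
try:
    json.dumps(row, allow_nan=False)
except ValueError as err:
    print(err)           # Out of range float values are not JSON compliant
```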