[
https://issues.apache.org/jira/browse/BEAM-6769?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16793232#comment-16793232
]
Chamikara Jayalath commented on BEAM-6769:
------------------------------------------
Thanks for the comments and the PR.
It looks like, for Python 2, writing raw bytes to BQ currently requires a schema.
We should document this and try to fix it.
How does it work when a schema is provided? Do we base64-encode bytes before
calling json.dumps() somewhere?
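(For reference, a minimal standalone illustration of the difference, not the actual bigquery_tools code path; it assumes BQ's JSON row format represents BYTES values as base64 strings:)
{noformat}
import base64
import json

# Python 2: b'abc' is just str, so json.dumps() serializes it directly.
# Python 3: bytes is a separate type and json.dumps() rejects it.
try:
    json.dumps({'test': b'abc'})
except TypeError as e:
    print(e)  # Object of type bytes is not JSON serializable

# Base64-encoding first yields an ASCII str that json.dumps() accepts.
print(json.dumps({'test': base64.b64encode(b'abc').decode('ascii')}))
# {"test": "YWJj"}
{noformat}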
For Python 3, I think we should do the following (rough sketch below):
* Allow users to pass either raw bytes or strings (unicode).
* Unicode values have to be UTF-8 encoded, as required by BQ:
[https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types#string-type]
* Both may have to be base64-encoded if required by json.dumps().
WDYT?
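A minimal sketch of that handling (illustration only, with a hypothetical encode_value() helper; not the current bigquery_tools implementation):
{noformat}
import base64
import json


def encode_value(value):
  """Prepares a single cell value for the JSON row payload sent to BQ.

  bytes are base64-encoded (assumed JSON representation for BYTES columns);
  unicode strings are passed through after checking they are UTF-8 encodable.
  """
  if isinstance(value, bytes):
    return base64.b64encode(value).decode('ascii')
  if isinstance(value, str):
    value.encode('utf-8')  # Raises UnicodeEncodeError for invalid strings.
    return value
  return value


row = {'bytes_col': b'\x00\x01\x02', 'string_col': u'caf\u00e9'}
print(json.dumps({k: encode_value(v) for k, v in row.items()}))
# {"bytes_col": "AAEC", "string_col": "caf\u00e9"}
{noformat}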
> BigQuery IO does not support bytes in Python 3
> ----------------------------------------------
>
> Key: BEAM-6769
> URL: https://issues.apache.org/jira/browse/BEAM-6769
> Project: Beam
> Issue Type: Sub-task
> Components: sdk-py-core
> Reporter: Juta Staes
> Assignee: Juta Staes
> Priority: Major
> Time Spent: 2.5h
> Remaining Estimate: 0h
>
> In Python 2 you could write bytes data to BigQuery. This is tested in
>
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py#L186]
> Python 3 does not support
> {noformat}
> json.dumps({'test': b'test'}){noformat}
> which is used to encode the data in
>
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/bigquery_tools.py#L959]
>
> How should writing bytes to BigQuery be handled in Python 3?
> * Forbid writing bytes into BigQuery on Python 3
> * Guess the encoding (utf-8?)
> * Pass the encoding to BigQuery
> cc: [~tvalentyn]