[ 
https://issues.apache.org/jira/browse/BEAM-6769?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16793809#comment-16793809
 ] 

Chamikara Jayalath commented on BEAM-6769:
------------------------------------------

"Why should unicode values be base64 encoded?" we don't have to if json.dump() 
can handle UTF-8 encoded unicode and BigQuery accept and store that correctly 
in a String column.
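
As a quick sanity check of that behavior on Python 3 (a minimal sketch, not Beam 
code): json.dumps() serializes unicode str values fine, but raises TypeError for 
raw bytes, which is the case this issue is about.

{noformat}
# Minimal Python 3 check: json.dumps handles unicode str values, but not bytes.
import json

# Unicode text serializes fine; BigQuery can store the value in a STRING column.
print(json.dumps({'name': u'na\u00efve caf\u00e9'}))

# Raw bytes are rejected, which is why BYTES values need base64 encoding first.
try:
    json.dumps({'data': b'\x00\x01binary'})
except TypeError as e:
    print('bytes are not JSON serializable:', e)
{noformat}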

 

"I think we should require specifying a schema" So when writing, either user 
specifies the schema or the table is already available. If table is already 
available we can find the schema through an API call. Either way the schema 
should be accessible when writing. So I don't think we have to require user to 
pass the schema (for existing tables) or pre base64 encode bytes data (we can 
do it ourselves based on the schema).
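
For illustration, a rough sketch (not the actual bigquery_tools.py 
implementation; the helper name and the schema dict layout are made up here) of 
doing the base64 encoding ourselves based on the schema:

{noformat}
# Rough sketch only: base64-encode BYTES fields based on the table schema so
# the row can be passed through json.dumps. Field/helper names are illustrative.
import base64
import json

def encode_row_for_json(row, schema_fields):
    encoded = dict(row)
    for field in schema_fields:
        name, field_type = field['name'], field['type']
        if field_type == 'BYTES' and isinstance(encoded.get(name), bytes):
            encoded[name] = base64.b64encode(encoded[name]).decode('ascii')
    return encoded

schema_fields = [{'name': 'id', 'type': 'INTEGER'},
                 {'name': 'payload', 'type': 'BYTES'}]
row = {'id': 1, 'payload': b'\x00\x01raw bytes'}
print(json.dumps(encode_row_for_json(row, schema_fields)))
{noformat}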

> BigQuery IO does not support bytes in Python 3
> ----------------------------------------------
>
>                 Key: BEAM-6769
>                 URL: https://issues.apache.org/jira/browse/BEAM-6769
>             Project: Beam
>          Issue Type: Sub-task
>          Components: sdk-py-core
>            Reporter: Juta Staes
>            Assignee: Juta Staes
>            Priority: Major
>          Time Spent: 3h
>  Remaining Estimate: 0h
>
> In Python 2 you could write bytes data to BigQuery. This is tested in
>  
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py#L186]
> Python 3 does not support
> {noformat}
> json.dumps({'test': b'test'}){noformat}
> which is used to encode the data in
>  
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/bigquery_tools.py#L959]
>  
> How should writing bytes to BigQuery be handled in Python 3?
>  * Forbid writing bytes into BigQuery on Python 3
>  * Guess the encoding (utf-8?)
>  * Pass the encoding to BigQuery
> cc: [~tvalentyn]



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)