[ 
https://issues.apache.org/jira/browse/BEAM-6769?focusedWorklogId=250221&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-250221
 ]

ASF GitHub Bot logged work on BEAM-6769:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 29/May/19 15:40
            Start Date: 29/May/19 15:40
    Worklog Time Spent: 10m 
      Work Description: tvalentyn commented on pull request #8621: 
[BEAM-6769][BEAM-7327] add it test for writing and reading with bigqu…
URL: https://github.com/apache/beam/pull/8621#discussion_r288634752
 
 

 ##########
 File path: sdks/python/apache_beam/io/gcp/bigquery_tools.py
 ##########
 @@ -998,6 +1004,13 @@ def encode(self, table_row):
     # This code will catch this error to emit an error that explains
     # to the programmer that they have used NAN/INF values.
     try:
+      # On Python 3, base64 bytes are decoded to strings before being sent to BQ.
+      if sys.version_info[0] == 3:
+        if isinstance(table_row, str):
 
 Review comment:
   I think that either the test needs to be fixed to pass dicts, or the BQ IO 
file loads method needs to use a different codec rather than one that expects dicts. 
   I suggest we change the test to pass dicts. @pabloem, @chamikaramj - do you 
have a different opinion on this?
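
   A minimal sketch of that suggestion, assuming a codec that only accepts dict 
rows (illustrative only, not the actual coder in bigquery_tools.py; the helper 
name encode_row_as_json is made up): bytes values inside the dict are 
base64-encoded to str so that json.dumps behaves the same on Python 2 and 
Python 3, and tests would always pass dicts rather than pre-encoded strings.

{noformat}
# Illustrative sketch only -- not the Beam coder implementation.
# A codec that expects dict rows; bytes values are base64-encoded to str so
# json.dumps works on both Python 2 and Python 3.
import base64
import json


def encode_row_as_json(table_row):
  # Hypothetical helper: reject anything that is not a dict, matching the
  # review suggestion that callers/tests should pass dicts.
  if not isinstance(table_row, dict):
    raise TypeError('Expected a dict row, got %r' % type(table_row))
  converted = {}
  for key, value in table_row.items():
    if isinstance(value, bytes):
      # base64-encode bytes so the value is JSON-serializable as a str.
      converted[key] = base64.b64encode(value).decode('ascii')
    else:
      converted[key] = value
  return json.dumps(converted).encode('utf-8')


# Example usage: the caller passes a dict, never a raw str.
print(encode_row_as_json({'name': 'abc', 'blob': b'\x00\x01'}))
{noformat}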
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 250221)
    Time Spent: 19.5h  (was: 19h 20m)

> BigQuery IO does not support bytes in Python 3
> ----------------------------------------------
>
>                 Key: BEAM-6769
>                 URL: https://issues.apache.org/jira/browse/BEAM-6769
>             Project: Beam
>          Issue Type: Sub-task
>          Components: sdk-py-core
>            Reporter: Juta Staes
>            Assignee: Juta Staes
>            Priority: Blocker
>             Fix For: 2.14.0
>
>          Time Spent: 19.5h
>  Remaining Estimate: 0h
>
> In Python 2 you could write bytes data to BigQuery. This is tested in
>  
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py#L186]
> Python 3 does not support
> {noformat}
> json.dumps({'test': b'test'}){noformat}
> which is used to encode the data in
>  
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/bigquery_tools.py#L959]
>  
> How should writing bytes to BigQuery be handled in Python 3?
>  * Forbid writing bytes into BigQuery on Python 3
>  * Guess the encoding (utf-8?)
>  * Pass the encoding to BigQuery
> cc: [~tvalentyn]
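
A minimal sketch of the failure and of the encoding options listed in the 
description above (illustrative only; the dict names are made up, and whether 
BigQuery JSON loads accept base64 strings for BYTES columns is an assumption 
to verify against the BigQuery documentation):

{noformat}
# Illustrative sketch of the options above; these are not Beam APIs.
import base64
import json

row = {'test': b'test'}

# On Python 3 this raises TypeError ("Object of type bytes is not JSON
# serializable"), which is the failure described in this issue.
try:
  json.dumps(row)
except TypeError as e:
  print('json.dumps failed: %s' % e)

# Option "guess the encoding (utf-8?)": decode bytes as UTF-8 before dumping.
as_text = {
    k: v.decode('utf-8') if isinstance(v, bytes) else v for k, v in row.items()
}
print(json.dumps(as_text))

# Alternative: base64-encode bytes values, assuming the BigQuery load job
# accepts base64 strings for BYTES columns (assumption -- check the BQ docs).
as_b64 = {
    k: base64.b64encode(v).decode('ascii') if isinstance(v, bytes) else v
    for k, v in row.items()
}
print(json.dumps(as_b64))
{noformat}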



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
