[
https://issues.apache.org/jira/browse/BEAM-6769?focusedWorklogId=249836&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-249836
]
ASF GitHub Bot logged work on BEAM-6769:
----------------------------------------
Author: ASF GitHub Bot
Created on: 29/May/19 05:57
Start Date: 29/May/19 05:57
Worklog Time Spent: 10m
Work Description: tvalentyn commented on pull request #8621:
[BEAM-6769][BEAM-7327] add it test for writing and reading with bigqu…
URL: https://github.com/apache/beam/pull/8621#discussion_r288402630
##########
File path: sdks/python/apache_beam/io/gcp/bigquery_tools.py
##########
@@ -998,6 +1004,13 @@ def encode(self, table_row):
# This code will catch this error to emit an error that explains
# to the programmer that they have used NAN/INF values.
try:
+ # on python 3, base64 bytes are decoded to strings before being sent to bq
+ if sys.version[0] == '3':
+ if type(table_row) == str:
Review comment:
I still don't understand why/when we may receive a string here rather than a
dictionary (as per the docstring) during the `encode()` call. The test you
referenced seems to call json.loads() before ingesting data into BQ:
https://github.com/apache/beam/blob/23115ce75c296def30c304e3b386632c270ba224/sdks/python/apache_beam/io/gcp/bigquery_file_loads_test.py#L83.
I created a PR on top of yours to see if the tests pass without this change:
https://github.com/apache/beam/pull/8709; I am waiting for the tests to finish.
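For context, a minimal standalone sketch (illustrative only, not code from the PR) of why bytes values break JSON encoding on Python 3, and how base64-encoding them to `str` avoids it. The `row` dict and the comprehension below are assumptions for demonstration:

```python
import base64
import json

row = {'test': b'test'}

# On Python 3, json.dumps rejects bytes values outright.
try:
    json.dumps(row)
except TypeError:
    pass  # "Object of type bytes is not JSON serializable"

# BigQuery expects BYTES columns as base64-encoded text in JSON load jobs,
# so encoding the bytes and decoding to str makes the row serializable.
encoded = {k: base64.b64encode(v).decode('utf-8') if isinstance(v, bytes) else v
           for k, v in row.items()}
json.dumps(encoded)  # works: '{"test": "dGVzdA=="}'
```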
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 249836)
Time Spent: 18h 40m (was: 18.5h)
> BigQuery IO does not support bytes in Python 3
> ----------------------------------------------
>
> Key: BEAM-6769
> URL: https://issues.apache.org/jira/browse/BEAM-6769
> Project: Beam
> Issue Type: Sub-task
> Components: sdk-py-core
> Reporter: Juta Staes
> Assignee: Juta Staes
> Priority: Blocker
> Fix For: 2.14.0
>
> Time Spent: 18h 40m
> Remaining Estimate: 0h
>
> In Python 2 you could write bytes data to BigQuery. This is tested in
>
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py#L186]
> Python 3 does not support
> {noformat}
> json.dumps({'test': b'test'}){noformat}
> which is used to encode the data in
>
> [https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/bigquery_tools.py#L959]
>
> How should writing bytes to BigQuery be handled in Python 3?
> * Forbid writing bytes into BigQuery on Python 3
> * Guess the encoding (utf-8?)
> * Pass the encoding to BigQuery
> cc: [~tvalentyn]
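The options quoted above could be combined into a version-guarded encode helper. The sketch below is hypothetical (`encode_row` is not the PR's actual code); it uses the idiomatic `sys.version_info` check rather than indexing the `sys.version` string, and base64-encodes bytes values before serialization:

```python
import base64
import json
import sys


def encode_row(table_row):
    """Serialize a dict row to a JSON string for a BigQuery load job.

    Bytes values are base64-encoded first, since json.dumps on
    Python 3 raises TypeError for bytes.
    """
    if sys.version_info[0] >= 3:
        table_row = {
            k: base64.b64encode(v).decode('utf-8') if isinstance(v, bytes) else v
            for k, v in table_row.items()
        }
    return json.dumps(table_row)
```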
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)