pabloem commented on code in PR #17431:
URL: https://github.com/apache/beam/pull/17431#discussion_r861280210
##########
sdks/python/apache_beam/io/gcp/bigquery.py:
##########
@@ -2190,6 +2190,20 @@ def expand(self, pcoll):
'A schema must be provided when writing to BigQuery using '
'Avro based file loads')
+ if self.schema and type(self.schema) is dict:
+
+ def find_in_nested_dict(schema):
+ for field in schema['fields']:
+ if field['type'] == 'JSON':
+ raise ValueError(
+ 'Found JSON type in table schema. JSON data '
+ 'insertion is currently not supported with '
+ 'FILE_LOADS write method.')
+ elif field['type'] == 'STRUCT':
+ find_in_nested_dict(field)
Review Comment:
So this is not supported for either AVRO or JSON based file loads, is that correct? Could you also add a reference to the BQ docs in the `ValueError`? That would make it easier for customers to diagnose the issue when it surfaces.
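
Something along these lines is what I had in mind, just a rough sketch (the docs URL, the helper name, and the RECORD/STRUCT handling are assumptions, not final wording):

```python
# Illustrative only: the URL and message text would need to be confirmed
# against the actual BigQuery documentation.
_BQ_JSON_LIMITATIONS_URL = (
    'https://cloud.google.com/bigquery/docs/json-data#limitations')


def _check_no_json_fields(schema):
  """Recursively rejects JSON-typed fields in a table schema dict."""
  for field in schema.get('fields', []):
    if field['type'] == 'JSON':
      raise ValueError(
          'Found JSON type in table schema. JSON data insertion is '
          'currently not supported with FILE_LOADS write method. '
          'See %s for details.' % _BQ_JSON_LIMITATIONS_URL)
    elif field['type'] in ('RECORD', 'STRUCT'):
      # Nested fields carry their own 'fields' list.
      _check_no_json_fields(field)


if __name__ == '__main__':
  schema = {
      'fields': [
          {'name': 'id', 'type': 'INTEGER'},
          {
              'name': 'payload',
              'type': 'RECORD',
              'fields': [{'name': 'attrs', 'type': 'JSON'}],
          },
      ]
  }
  try:
    _check_no_json_fields(schema)
  except ValueError as e:
    print(e)
```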
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]