svetakvsundhar commented on code in PR #17159:
URL: https://github.com/apache/beam/pull/17159#discussion_r870886625
##########
sdks/python/apache_beam/io/gcp/bigquery.py:
##########
@@ -2525,6 +2526,12 @@ def _get_pipeline_details(unused_elm):
**self._kwargs))
| _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))
+ def get_pcoll_from_schema(table_schema):
+ pcoll_val = apache_beam.io.gcp.bigquery_schema_tools.\
+ produce_pcoll_with_schema(table_schema)
+ return beam.Map(lambda values: pcoll_val(**values)).with_output_types(
Review Comment:
@TheNeuralBit just to follow up here: it looks like ```RowCoder``` was
indeed the coder being used. I verified that the asserts pass here
https://github.com/svetakvsundhar/beam/blob/bqio/sdks/python/apache_beam/coders/row_coder_test.py#L263,
and both cases turned out to be using ```RowCoder```.
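
For context, a minimal sketch of the kind of check I mean (```ExampleRow``` is just an illustrative NamedTuple standing in for the generated usertype, not the PR's test code):

```python
# Illustrative only: confirm that Beam's coder registry hands back RowCoder
# for a schema'd NamedTuple type.
import typing

from apache_beam import coders


class ExampleRow(typing.NamedTuple):  # stand-in for the generated usertype
  id: int
  name: str


# Register the type with RowCoder, as in the Beam schema docs, then check
# which coder the registry actually returns for it.
coders.registry.register_coder(ExampleRow, coders.RowCoder)
inferred_coder = coders.registry.get_coder(ExampleRow)
assert isinstance(inferred_coder, coders.RowCoder), inferred_coder
```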
Are there any other intermediate checks I can run on our ```usertype``` to
verify it within the ```RowCoder``` PTransform? One thing I noticed is that
```usertype``` doesn't have a ```.schema``` property, so I wasn't able to
confirm that the argument to this PTransform is actually a schema
(```apache_beam.portability.api.schema_pb2.Schema```). Is there an easy way to
verify that the input is indeed of that type?
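
For example, would something along these lines be a reasonable check (again a minimal sketch with an illustrative ```ExampleRow``` NamedTuple, assuming the ```named_tuple_to_schema``` helper in ```apache_beam.typehints.schemas```)?

```python
# Illustrative only: verify that the value being passed around is really a
# schema proto (schema_pb2.Schema).
import typing

from apache_beam.portability.api import schema_pb2
from apache_beam.typehints import schemas


class ExampleRow(typing.NamedTuple):  # stand-in for the generated usertype
  id: int
  name: str


# Convert the NamedTuple usertype to its schema proto, then check its type.
schema_proto = schemas.named_tuple_to_schema(ExampleRow)
assert isinstance(schema_proto, schema_pb2.Schema), type(schema_proto)
```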
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]