jd185367 commented on issue #25946:
URL: https://github.com/apache/beam/issues/25946#issuecomment-2422781348

   I'd also like to bump this, as it's needed for using `WriteToBigQuery` in Python:
   
   
https://github.com/apache/beam/blob/fa9eb2fe17f5f96b40275fe7b0a3981f4a52e0df/sdks/python/apache_beam/io/gcp/bigquery.py#L1887
   
   Google recommends using the `STORAGE_WRITE_API` method in their [Dataflow Best Practices](https://cloud.google.com/dataflow/docs/guides/write-to-bigquery#best-practices-streaming), which requires passing this transform the `schema` argument for a table. But since many of our BigQuery tables have a `DATE` or `DATETIME` column, neither of which is supported yet for these schemas in Python, we aren't able to use this method.
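   For reference, here is a minimal sketch of the setup that runs into this. The table and field names (`event_id`, `event_date`, `created_at`, `my_project:my_dataset.my_table`) are hypothetical, and the commented-out transform call is illustrative only; the `DATE`/`DATETIME` fields in the schema are what trigger the unsupported-type error with `STORAGE_WRITE_API`:

   ```python
   # Table schema in the dict form accepted by WriteToBigQuery's `schema` argument.
   # The DATE and DATETIME fields below are the problematic ones.
   table_schema = {
       "fields": [
           {"name": "event_id", "type": "STRING", "mode": "REQUIRED"},      # hypothetical
           {"name": "event_date", "type": "DATE", "mode": "NULLABLE"},      # unsupported
           {"name": "created_at", "type": "DATETIME", "mode": "NULLABLE"},  # unsupported
       ]
   }

   # Sketch of the write (needs a running Beam pipeline and a GCP project):
   # rows | beam.io.WriteToBigQuery(
   #     table="my_project:my_dataset.my_table",
   #     schema=table_schema,
   #     method=beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API,
   # )

   # Collect the date-like columns that block use of STORAGE_WRITE_API.
   date_like_types = {
       f["name"]: f["type"]
       for f in table_schema["fields"]
       if f["type"] in ("DATE", "DATETIME")
   }
   ```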
   
   > As of Beam 2.60.0, we haven't found a workaround - e.g. specifying our `DATE` columns as `TIMESTAMP` in the Python schema seems to fail either when Beam actually writes to BigQuery, or at some point while the Java code is doing its own conversion. If anyone knows a workaround for this, I'd appreciate it.

