cozos commented on issue #22986:
URL: https://github.com/apache/beam/issues/22986#issuecomment-1235481552

   Hi @brucearctor, thanks for the suggestion. Upon further investigation, I've 
found that in my case (using `FileFormat.AVRO`), the 
`WriteToBigQuery/BigQueryBatchFileLoads` transform fails while writing the 
temporary file (typically due to a schema mismatch).
   
   So, as you mentioned, `max_bad_records` is a workaround for 
`FileFormat.JSON`, but with Avro the pipeline fails before it even reaches the 
`LoadJob` step.
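
   For reference, a minimal sketch of the JSON-side workaround discussed above: with `FILE_LOADS`, Beam's `WriteToBigQuery` forwards entries in `additional_bq_parameters` to the BigQuery load-job configuration, which is where BigQuery's `maxBadRecords` setting lives. The table name, schema, and threshold below are illustrative placeholders, not values from this issue.

```python
# Hedged sketch of the max_bad_records workaround for FileFormat.JSON loads.
# The helper just builds the load-job configuration dict that Beam passes
# through to BigQuery; the threshold is a placeholder.

def json_load_job_params(max_bad_records: int) -> dict:
    """Load-job settings asking BigQuery to tolerate some bad rows."""
    return {
        "maxBadRecords": max_bad_records,
        # "ignoreUnknownValues": True,  # optional: also skip unknown fields
    }

# In a pipeline this would be wired up roughly as (requires apache-beam[gcp];
# table and schema are placeholders):
#
#   from apache_beam.io.gcp import bigquery_tools
#
#   rows | beam.io.WriteToBigQuery(
#       "my-project:my_dataset.my_table",
#       schema="name:STRING,age:INTEGER",
#       method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
#       temp_file_format=bigquery_tools.FileFormat.JSON,
#       additional_bq_parameters=json_load_job_params(10))

print(json_load_job_params(10))
```

   As noted above, this only helps for JSON: with `FileFormat.AVRO` the failure happens while writing the temporary file, before any load job (and hence `maxBadRecords`) comes into play.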
