Abacn commented on issue #28219: URL: https://github.com/apache/beam/issues/28219#issuecomment-1701426481
> > Either use the legacy runner (with Streaming Engine enabled), or under Dataflow Runner v2 manually set sharding with `BigQueryIO.write().withNumFileShards(1)` (or any positive integer) to mitigate this issue.
>
> I did try using `withNumFileShards`, but I still see the same problem causing the same load job id.

You're right, sorry, I mixed up different test pipelines. I confirm that the issue still exists even with `withNumFileShards` set. The problem does not need `GroupIntoBatches` to trigger.
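For context, the mitigation discussed above would look roughly like the following fragment, assuming a streaming pipeline writing to BigQuery via file loads. This is a sketch only: `rows`, the destination table, and the triggering frequency are placeholders, not taken from the issue, and per the comment above this setting does not actually resolve the duplicate load job id.

```java
// Sketch only: streaming BigQuery write via file loads with an explicit
// shard count, as suggested as a possible mitigation. Placeholders throughout.
rows.apply(
    BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")              // placeholder destination
        .withMethod(BigQueryIO.Write.Method.FILE_LOADS)    // force load jobs
        .withTriggeringFrequency(Duration.standardMinutes(5)) // placeholder frequency
        .withNumFileShards(1));                            // manual sharding
```

`withNumFileShards` requires `withTriggeringFrequency` when using `FILE_LOADS` in a streaming pipeline; the shard count here can be any positive integer.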
