nbali commented on code in PR #22953:
URL: https://github.com/apache/beam/pull/22953#discussion_r963987206


##########
sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BatchLoads.java:
##########
@@ -113,6 +113,9 @@
   // If user triggering is supplied, we will trigger the file write after this many records are
   // written.
   static final int FILE_TRIGGERING_RECORD_COUNT = 500000;
+  // If user triggering is supplied, we will trigger the file write after this many bytes are
+  // written.
+  static final long FILE_TRIGGERING_BYTE_COUNT = 100 * (1L << 20); // 100MiB

Review Comment:
   @lukecwik Having the same limit as the buffer actually makes sense to me, but could you point me to where I might find that limit? I can see it mentioned in the comments for `DEFAULT_MAX_NUM_WRITERS_PER_BUNDLE`, but rather than hardcoding 64MB here as well, I would prefer to reference the original limit directly.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
