pabloem commented on a change in pull request #12761:
URL: https://github.com/apache/beam/pull/12761#discussion_r483220143
##########
File path: sdks/python/apache_beam/io/gcp/bigquery_file_loads.py
##########
@@ -356,8 +359,13 @@ def process(self, element, job_name_prefix=None):
copy_from_reference.projectId = vp.RuntimeValueProvider.get_value(
'project', str, '')
- copy_job_name = '%s_%s' % (
+ copy_job_name = '%s_%s_%s' % (
job_name_prefix,
+ _bq_uuid(
+ '%s:%s.%s' % (
Review comment:
Addressing the question about why this did not work: the BQ job names are
generated by joining a prefix with the hash of a table name. COPY jobs are
necessary when we write to multiple temporary tables and then copy them to
a single destination table.
Because we were hashing the destination table name, ALL of the copy jobs
shared the same destination, so they had the same hash, and therefore the
same job name - and this caused a failure.
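A minimal sketch of the collision, assuming a deterministic hash helper
(the `_bq_uuid` stand-in, the prefix, and the table names below are all
hypothetical, not the SDK's actual implementation):
```python
import hashlib

def _bq_uuid(seed):
    # Illustrative stand-in: a deterministic hex digest of the seed string.
    return hashlib.md5(seed.encode('utf-8')).hexdigest()

job_name_prefix = 'beam_bq_job_COPY_myjob'
destination = 'my-project:dataset.final_table'
temp_tables = [
    'my-project:dataset.temp_0',
    'my-project:dataset.temp_1',
]

# Old scheme: the hash covers only the destination table, so every copy
# job collapses to the same name and duplicate submissions fail.
old_names = {'%s_%s' % (job_name_prefix, _bq_uuid(destination))
             for _ in temp_tables}
assert len(old_names) == 1

# Fixed scheme: include a component derived from the source temp table,
# so each (source, destination) pair yields a distinct job name.
new_names = {
    '%s_%s_%s' % (job_name_prefix, _bq_uuid(src), _bq_uuid(destination))
    for src in temp_tables
}
assert len(new_names) == len(temp_tables)
```
Any per-source component in the name is enough to disambiguate the copy
jobs, which is what the extra hashed segment in the diff above provides.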