BhuviTheDataGuy opened a new issue #11660:
URL: https://github.com/apache/airflow/issues/11660


   I was using `GoogleCloudStorageToBigQueryOperator` and then switched to 
`GCSToBigQueryOperator`. When I run parallel data exports from GCS to BQ 
(I generate dynamic tasks via a for loop), it uses 
`test-composer:us-west2.airflow_1603109319` (which looks like node name + 
current timestamp) as the BQ job ID for all of the tasks. 
   
   **Error**
   ```
   ERROR - 409 POST https://bigquery.googleapis.com/bigquery/v2/projects/centili-prod/jobs: Already Exists: Job test-composer:us-west2.airflow_1603109319
   Traceback (most recent call last)
   ```
   This prevents the 2nd table from being imported; it has to wait a minute 
(for the retry configured in the DAG) before it is imported.
   
   But the older operator generates a proper job ID (like `Job_someUUID`).
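   
   To illustrate the collision (this is a minimal standalone sketch, not the 
actual Airflow internals; the function names and the `airflow` prefix are 
assumptions based on the observed job ID):
   
   ```python
   import time
   import uuid
   
   def timestamp_job_id(prefix="airflow"):
       # Mimics the observed scheme: prefix + current epoch seconds.
       # Two tasks submitted within the same second get the SAME id,
       # so BigQuery rejects the second insert with 409 Already Exists.
       return f"{prefix}_{int(time.time())}"
   
   def uuid_job_id():
       # Mimics the older operator's behavior (job IDs like Job_someUUID):
       # a random UUID makes every submission unique.
       return f"job_{uuid.uuid4().hex}"
   
   # Two "parallel" tasks started in the same second:
   a, b = timestamp_job_id(), timestamp_job_id()
   # a == b  ->  the second BigQuery load job fails with 409
   
   x, y = uuid_job_id(), uuid_job_id()
   # x != y  ->  no collision, both load jobs are accepted
   ```
   
   So a UUID-based (or otherwise per-task-unique) job ID would avoid the 409 
for dynamically generated parallel tasks.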


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
