david-loveholidays opened a new issue, #24729:
URL: https://github.com/apache/airflow/issues/24729

   ### Apache Airflow Provider(s)
   
   google
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-google==2022.6.22+composer
   
   ### Apache Airflow version
   
   2.2.5
   
   ### Operating System
   
   Google Cloud Composer running on GCP Kubernetes
   
   ### Deployment
   
   Composer
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   BigQueryToBigQueryOperator failed to copy a table.
   Code:
   ```
       overwrite_sales = BigQueryToBigQueryOperator(
           task_id="overwrite_sales",
           source_project_dataset_tables=sales_gbp_refresh_table,
           destination_project_dataset_table=sales_table,
           write_disposition="WRITE_TRUNCATE",
           retries=2,
       )
   ```
   Logs:
   ```
    [2022-06-28, 22:26:28 UTC] {taskinstance.py:1776} ERROR - Task failed with exception
    Traceback (most recent call last):
      File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/transfers/bigquery_to_bigquery.py", line 144, in execute
        job = hook.get_job(job_id=job_id).to_api_repr()
      File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/common/hooks/base_google.py", line 439, in inner_wrapper
        return func(self, *args, **kwargs)
      File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/hooks/bigquery.py", line 1494, in get_job
        job = client.get_job(job_id=job_id, project=project_id, location=location)
      File "/opt/python3.8/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 2066, in get_job
        resource = self._call_api(
      File "/opt/python3.8/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 782, in _call_api
        return call()
      File "/opt/python3.8/lib/python3.8/site-packages/google/api_core/retry.py", line 283, in retry_wrapped_func
        return retry_target(
      File "/opt/python3.8/lib/python3.8/site-packages/google/api_core/retry.py", line 190, in retry_target
        return target()
      File "/opt/python3.8/lib/python3.8/site-packages/google/cloud/_http/__init__.py", line 494, in api_request
        raise exceptions.from_http_response(response)
    google.api_core.exceptions.NotFound: 404 GET https://bigquery.googleapis.com/bigquery/v2/projects/...?projection=full&prettyPrint=false: Not found: Job ...:airflow_...
    ```
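
   The traceback shows `hook.get_job(job_id=job_id)` being called without a `location`, so the API appears to look the job up in the default location and returns 404 for jobs created in other regions. A minimal sketch of how the lookup URL differs with and without the parameter (project, job ID, and region below are made-up placeholders, not values from the failing task):
   ```python
   from typing import Optional


   def job_lookup_url(project: str, job_id: str, location: Optional[str] = None) -> str:
       """Build the BigQuery REST URL used to fetch a job, mirroring the GET
       seen in the traceback above (projection/prettyPrint params omitted)."""
       url = f"https://bigquery.googleapis.com/bigquery/v2/projects/{project}/jobs/{job_id}"
       if location:
           # Without this parameter, jobs in non-default locations 404.
           url += f"?location={location}"
       return url


   # Hypothetical identifiers, for illustration only.
   print(job_lookup_url("my-project", "airflow_job_123"))
   print(job_lookup_url("my-project", "airflow_job_123", location="europe-west2"))
   ```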
   
   ### What you think should happen instead
   
   The operator should copy the source table into the destination table without raising a 404 when it looks up the copy job.
   
   ### How to reproduce
   
   _No response_
   
   ### Anything else
   
   This looks like it is related to this PR: https://github.com/apache/airflow/pull/24416
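   
   Until a release containing that fix is available, one possible workaround (a sketch only, not verified on Composer) is to run the copy through `BigQueryInsertJobOperator`, which accepts an explicit `location`. The `configuration` dict follows the BigQuery copy-job shape; all project, dataset, table, and region names below are placeholders:
   ```python
   # Copy-job configuration for BigQueryInsertJobOperator (placeholder names).
   copy_job_configuration = {
       "copy": {
           "sourceTable": {
               "projectId": "my-project",       # hypothetical
               "datasetId": "sales",            # hypothetical
               "tableId": "sales_gbp_refresh",  # hypothetical
           },
           "destinationTable": {
               "projectId": "my-project",
               "datasetId": "sales",
               "tableId": "sales",
           },
           "writeDisposition": "WRITE_TRUNCATE",
       }
   }

   # In the DAG this would be used roughly as (not executed here):
   # overwrite_sales = BigQueryInsertJobOperator(
   #     task_id="overwrite_sales",
   #     configuration=copy_job_configuration,
   #     location="europe-west2",  # the dataset's region; hypothetical
   #     retries=2,
   # )
   ```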
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

