okayhooni opened a new issue, #37557: URL: https://github.com/apache/airflow/issues/37557
### Apache Airflow Provider(s)

google

### Versions of Apache Airflow Providers

latest

### Apache Airflow version

2.3.x ~ latest

### Operating System

Debian GNU/Linux 11 (bullseye)

### Deployment

Official Apache Airflow Helm Chart

### Deployment details

Deployed on an EKS cluster with a customized Airflow Helm chart based on the official chart.

### What happened

`BigQueryDataTransferServiceStartTransferRunsOperator` fails with an `Access Denied` error on long-running migration jobs. In particular, when a BigQuery Data Transfer job triggered by the Airflow operator exceeds one hour, it fails due to expiration of the credential (default lifespan = 1 hour). However, I have sometimes hit the same `Access Denied` error **even on jobs that had been running for less than 10 minutes.**

```
Access Denied: Table projectid:table_name: Permission bigquery.tables.get denied on table projectid:datasetid.table_name (or it may not exist).
```

(These DTS job error logs can also be seen in the BigQuery Data Transfer console.)

At first, I thought it was a token-expiration issue, so I attempted to refresh the token; however, it had no effect on the problem:

- https://github.com/apache/airflow/pull/37538

### What you think should happen instead

- The `BigQueryDataTransferServiceStartTransferRunsOperator` task should succeed regardless of how long the migration job runs.
- As mentioned above, I sometimes hit the same `Access Denied` error **even on jobs that had been running for less than 10 minutes** (so I suspect the issue lies on the GCP side itself).

### How to reproduce

- Run a BigQuery DTS job that migrates a sufficiently large data source, so that it takes more than an hour.
- It can sometimes be reproduced on shorter DTS jobs (< 10 minutes).

### Anything else

Related issue:
- https://github.com/apache/airflow/issues/31648

Related PRs:
- https://github.com/apache/airflow/pull/31651
- https://github.com/apache/airflow/pull/32673

Related Google BigQuery docs:
- https://cloud.google.com/bigquery/docs/transfer-troubleshooting

> Error: Access Denied: ... Permission bigquery.tables.get denied on table ...
>
> Resolution: Confirm that the BigQuery Data Transfer Service [service agent](https://cloud.google.com/bigquery/docs/enable-transfer-service#service_agent) is granted the [bigquery.dataEditor role](https://cloud.google.com/bigquery/docs/access-control#bigquery.dataEditor) on the target dataset. This grant is automatically applied when creating and updating the transfer, but it's possible that the access policy was modified manually afterwards. To regrant the permission, see [Grant access to a dataset](https://cloud.google.com/bigquery/docs/control-access-to-resources-iam#grant_access_to_a_dataset).

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
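To make the timing aspect of the failure concrete, here is a minimal, self-contained sketch (plain Python, not Airflow or GCP client code; the one-hour default access-token lifetime is the only detail taken from the report above). It shows why a credential captured when the transfer run is started expires mid-run for any job longer than an hour, and why refreshing the token on the Airflow side afterwards cannot help a run that already holds the old credential:

```python
from datetime import datetime, timedelta

# Default GCP access-token lifetime, as described in the report above.
TOKEN_LIFESPAN = timedelta(hours=1)

def token_valid(issued_at: datetime, now: datetime,
                lifespan: timedelta = TOKEN_LIFESPAN) -> bool:
    """Return True while a token issued at `issued_at` is still valid at `now`."""
    return now - issued_at < lifespan

# A transfer run submitted at t0 carries the token issued at t0 for its
# whole lifetime; a 90-minute run outlives that token, matching the
# "Access Denied" seen on jobs exceeding one hour.
t0 = datetime(2024, 2, 20, 12, 0, 0)
print(token_valid(t0, t0 + timedelta(minutes=30)))   # still valid mid-run
print(token_valid(t0, t0 + timedelta(minutes=90)))   # expired before the run ends
```

Note that this only models the >1 hour failures; the `Access Denied` errors on sub-10-minute jobs reported above are not explained by token lifetime, which is consistent with the report's guess that those stem from the service-agent permissions described in the troubleshooting doc.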
