ivan-toriya-precis opened a new issue, #32918:
URL: https://github.com/apache/airflow/issues/32918

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   Airflow Version: 2.5.3
   
   In my_connection_X (project_id_X) I have my_service_account_X, which has the BigQuery Admin role scoped to dataset_Y (in project_id_Y).
   
   ```python
   gcs_to_bq = GCSToBigQueryOperator(
       task_id="gcs_to_bq",
       bucket="bucket_name_Y",
       source_format="CSV",
       compression="GZIP",
       source_objects=f"{DIR_NAME}/{FILE_NAME}.csv",
       destination_project_dataset_table="project_id_Y.dataset_Y.table_Y",
       write_disposition="WRITE_TRUNCATE",
       gcp_conn_id="my_connection_X",
       skip_leading_rows=1,
       impersonation_chain="my_service_account_X",
   )
   ```
   
   This worked on Cloud Composer 1 (Airflow 2.2.5); after switching to Cloud Composer 2 (Airflow 2.5.3), the task fails with the error below.
   
   ### What you think should happen instead
   
   Dataset-scoped access should keep working as it did before.
   
   Instead, the task now fails with an error about a missing **project**-level permission (`bigquery.jobs.create`):
   
   ```sh
   [2023-07-28, 13:22:49 UTC] {taskinstance.py:1778} ERROR - Task failed with 
exception
   Traceback (most recent call last):
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py",
 line 378, in execute
       job: BigQueryJob | UnknownJob = self._submit_job(self.hook, job_id)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py",
 line 300, in _submit_job
       return hook.insert_job(
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/common/hooks/base_google.py",
 line 468, in inner_wrapper
       return func(self, *args, **kwargs)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/providers/google/cloud/hooks/bigquery.py",
 line 1595, in insert_job
       job_api_repr._begin()
     File 
"/opt/python3.8/lib/python3.8/site-packages/google/cloud/bigquery/job/base.py", 
line 693, in _begin
       api_response = client._call_api(
     File 
"/opt/python3.8/lib/python3.8/site-packages/google/cloud/bigquery/client.py", 
line 816, in _call_api
       return call()
     File 
"/opt/python3.8/lib/python3.8/site-packages/google/api_core/retry.py", line 
349, in retry_wrapped_func
       return retry_target(
     File 
"/opt/python3.8/lib/python3.8/site-packages/google/api_core/retry.py", line 
191, in retry_target
       return target()
     File 
"/opt/python3.8/lib/python3.8/site-packages/google/cloud/_http/__init__.py", 
line 494, in api_request
       raise exceptions.from_http_response(response)
   google.api_core.exceptions.Forbidden: 403 POST 
https://bigquery.googleapis.com/bigquery/v2/projects/project_id_Y/jobs?prettyPrint=false:
 Access Denied: Project project_id_Y: User does not have bigquery.jobs.create 
permission in project project_id_Y.
   ``` 
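   My reading of the traceback (an assumption, not confirmed provider internals): the load job is now submitted to the project taken from the first component of `destination_project_dataset_table` (hence the `projects/project_id_Y/jobs` URL), and `bigquery.jobs.create` is only grantable at project level, so dataset-scoped BigQuery Admin is not enough. A minimal sketch of that split, using a hypothetical helper rather than the provider's actual code:
   
   ```python
   def split_tablename(dest):
       """Split 'project.dataset.table' into (project, dataset, table)."""
       project, dataset, table = dest.split(".")
       return project, dataset, table
   
   project, dataset, table = split_tablename("project_id_Y.dataset_Y.table_Y")
   # The job is submitted to `project`, so the impersonated account needs
   # bigquery.jobs.create there -- a project-level permission.
   ```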
   
   ### How to reproduce
   
   1. Have two GCP projects (project_id_X and project_id_Y).
   2. Create my_service_account_X in project_id_X.
   3. Grant my_service_account_X the BigQuery Admin role scoped to dataset_Y (in project_id_Y).
   4. Run GCSToBigQueryOperator on any file, as in the snippet above.
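   With that setup, the load fails with the 403 above, because `bigquery.jobs.create` is project-level and dataset-scoped BigQuery Admin does not confer it. As a workaround (not a fix for the behavior change), granting a project-level job-running role in project_id_Y should unblock the task; a hypothetical invocation, with a placeholder service-account email:
   
   ```shell
   # roles/bigquery.jobUser grants bigquery.jobs.create at project level.
   gcloud projects add-iam-policy-binding project_id_Y \
       --member="serviceAccount:my_service_account_X@project_id_X.iam.gserviceaccount.com" \
       --role="roles/bigquery.jobUser"
   ```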
   
   ### Operating System
   
   Ubuntu 20.04.6 LTS
   
   ### Versions of Apache Airflow Providers
   
   ```sh
   apache-airflow-providers-apache-beam @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_apache_beam-5.1.1-py3-none-any.whl
   apache-airflow-providers-cncf-kubernetes @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_cncf_kubernetes-7.1.0-py3-none-any.whl
   apache-airflow-providers-common-sql @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_common_sql-1.5.2-py3-none-any.whl
   apache-airflow-providers-dbt-cloud @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_dbt_cloud-3.2.1-py3-none-any.whl
   apache-airflow-providers-ftp @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_ftp-3.4.2-py3-none-any.whl
   apache-airflow-providers-google @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_google-2023.6.6%2Bcomposer-py3-none-any.whl
   apache-airflow-providers-hashicorp @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_hashicorp-3.4.1-py3-none-any.whl
   apache-airflow-providers-http @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_http-4.4.2-py3-none-any.whl
   apache-airflow-providers-imap @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_imap-3.2.2-py3-none-any.whl
   apache-airflow-providers-mysql @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_mysql-5.1.1-py3-none-any.whl
   apache-airflow-providers-postgres @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_postgres-5.5.1-py3-none-any.whl
   apache-airflow-providers-sendgrid @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_sendgrid-3.2.1-py3-none-any.whl
   apache-airflow-providers-sftp==4.4.0
   apache-airflow-providers-slack==7.3.1
   apache-airflow-providers-sqlite @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_sqlite-3.4.2-py3-none-any.whl
   apache-airflow-providers-ssh @ 
file:///usr/local/lib/airflow-pypi-dependencies-2.5.3/python3.8/apache_airflow_providers_ssh-3.7.1-py3-none-any.whl
   ```
   
   ### Deployment
   
   Google Cloud Composer
   
   ### Deployment details
   
   composer-2.3.4-airflow-2.5.3
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

