ying-w opened a new issue, #30635:
URL: https://github.com/apache/airflow/issues/30635

   ### Apache Airflow Provider(s)
   
   google
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-google==8.11.0
   google-cloud-bigquery==2.34.4
   
   ### Apache Airflow version
   
   2.5.2+astro.2
   
   ### Operating System
   
   OSX
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   When setting the `project_id` parameter on `BigQueryGetDataOperator`, the default project from the environment is not overridden. Maybe something broke after the parameter was added in https://github.com/apache/airflow/pull/25782?
   
   ### What you think should happen instead
   
   The `project_id` passed as a parameter should take precedence over the project read from the environment.
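
   A minimal sketch of the precedence I would expect; `resolve_project_id` is a hypothetical helper for illustration, not the provider's actual resolution code:

   ```py
   import os

   def resolve_project_id(param_project_id=None):
       """Hypothetical helper: an explicit parameter should win over
       the GOOGLE_CLOUD_PROJECT environment default."""
       if param_project_id:
           return param_project_id
       return os.environ.get("GOOGLE_CLOUD_PROJECT")

   os.environ["GOOGLE_CLOUD_PROJECT"] = "my_primary_project"
   print(resolve_project_id("my_non_default_project"))  # my_non_default_project
   print(resolve_project_id())                          # my_primary_project
   ```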
   
   ### How to reproduce
   
   ```py
   from airflow.providers.google.cloud.operators.bigquery import BigQueryGetDataOperator

   bq = BigQueryGetDataOperator(
       task_id="my_test_query_task_id",
       gcp_conn_id="bigquery",
       table_id="mytable",
       dataset_id="mydataset",
       project_id="my_non_default_project",
   )
   result = bq.execute(None)
   ```
   
   In the environment I have set:
   ```sh
   AIRFLOW_CONN_BIGQUERY=gcpbigquery://
   GOOGLE_CLOUD_PROJECT=my_primary_project
   GOOGLE_APPLICATION_CREDENTIALS=/usr/local/airflow/gcloud/application_default_credentials.json
   ```
   
   The credentials JSON file does not contain a project entry.
   
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

