guillaumeblaquiere opened a new issue, #40559:
URL: https://github.com/apache/airflow/issues/40559

   ### Apache Airflow Provider(s)
   
   google
   
   ### Versions of Apache Airflow Providers
   
   All versions; even the latest release does not contain the correct line of code.
   
   ### Apache Airflow version
   
   2.6.3
   
   ### Operating System
   
   composer-2.6.0
   
   ### Deployment
   
   Google Cloud Composer
   
   ### Deployment details
   
   Composer 2 version
   composer-2.6.0-airflow-2.6.3
   
   
   ### What happened
   
   When using the CloudBatchSubmitJobOperator, the project_id is a required 
parameter. However, this parameter is ignored: it is not forwarded to the hook's 
submit_batch_job call. As a result, the job is always submitted to the runtime 
project (where Composer is deployed) and never to the target project (where the 
actual workload must run).
   
   ### What you think should happen instead
   
   The project_id provided to the operator should be forwarded to the 
underlying hook. All the other operations (get, list, delete) already implement 
this correctly; only the submit operation has this issue.
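   
   To illustrate, here is a sketch of the shape the submit path should take. The 
names mirror the operator and CloudBatchHook, but the exact signature of 
submit_batch_job is an assumption and may differ between provider versions; the 
hook may also need to accept project_id, as the get/list/delete methods 
reportedly already do.
   
   ```python
   # Sketch of the proposed change inside CloudBatchSubmitJobOperator.execute().
   # The hook call signature here is an assumption, not the verbatim provider code.

   # Current behaviour (as reported): project_id is dropped, so the hook falls
   # back to the connection's default project (the Composer project).
   hook.submit_batch_job(job_name=self.job_name, job=self.job, region=self.region)

   # Proposed: forward the operator's project_id, mirroring get/list/delete.
   hook.submit_batch_job(
       job_name=self.job_name,
       job=self.job,
       region=self.region,
       project_id=self.project_id,
   )
   ```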
   
   ### How to reproduce
   
   Create a Cloud Composer environment in one project, then configure a 
CloudBatchSubmitJobOperator with an explicit project_id referencing another 
project. The Batch job is created in the Cloud Composer project instead of the 
other project.
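   
   A minimal DAG along these lines reproduces it (the project ID, region, and 
container image below are placeholders, and the Batch job definition is 
deliberately minimal):
   
   ```python
   from __future__ import annotations

   from datetime import datetime

   from airflow import DAG
   from airflow.providers.google.cloud.operators.cloud_batch import CloudBatchSubmitJobOperator
   from google.cloud import batch_v1

   TARGET_PROJECT_ID = "target-project"  # placeholder: NOT the Composer project
   REGION = "us-central1"                # placeholder region


   def _example_job() -> batch_v1.Job:
       """Minimal single-task container job, just enough to see where it lands."""
       container = batch_v1.Runnable.Container(image_uri="gcr.io/google-containers/busybox")
       runnable = batch_v1.Runnable(container=container)
       task_spec = batch_v1.TaskSpec(runnables=[runnable])
       task_group = batch_v1.TaskGroup(task_count=1, task_spec=task_spec)
       return batch_v1.Job(task_groups=[task_group])


   with DAG(
       dag_id="cloud_batch_project_id_repro",
       start_date=datetime(2024, 1, 1),
       schedule=None,
       catchup=False,
   ) as dag:
       CloudBatchSubmitJobOperator(
           task_id="submit_batch_job",
           project_id=TARGET_PROJECT_ID,  # explicitly target the other project
           region=REGION,
           job_name="project-id-repro-job",
           job=_example_job(),
       )

   # Expected: the job is created in TARGET_PROJECT_ID.
   # Observed: the job is created in the project where Composer runs.
   ```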
   
   ### Anything else
   
   I will submit the fix
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

