Taragolis opened a new issue, #29556:
URL: https://github.com/apache/airflow/issues/29556

   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   main/develop
   
   ### Apache Airflow version
   
   main/develop
   
   ### Operating System
   
   Not relevant
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   Not relevant
   
   ### What happened
   
   
[`EcsCreateClusterOperator`](https://github.com/apache/airflow/blob/2a34dc9e8470285b0ed2db71109ef4265e29688b/airflow/providers/amazon/aws/operators/ecs.py#L112-L118),
[`EcsDeleteClusterOperator`](https://github.com/apache/airflow/blob/2a34dc9e8470285b0ed2db71109ef4265e29688b/airflow/providers/amazon/aws/operators/ecs.py#L153-L161),
[`EcsDeregisterTaskDefinitionOperator`](https://github.com/apache/airflow/blob/2a34dc9e8470285b0ed2db71109ef4265e29688b/airflow/providers/amazon/aws/operators/ecs.py#L191-L198),
[`EcsRegisterTaskDefinitionOperator`](https://github.com/apache/airflow/blob/2a34dc9e8470285b0ed2db71109ef4265e29688b/airflow/providers/amazon/aws/operators/ecs.py#L255-L260)
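For context, the reported pattern can be sketched roughly like this (simplified, hypothetical classes for illustration only — not the actual provider code):

```python
# Simplified, hypothetical sketch of the reported pattern: an operator's
# execute() builds *another* operator/sensor to do the waiting, and that inner
# object falls back to its own default connection instead of the outer one.

class InnerWaitSensor:
    """Stand-in for a sensor instantiated inside another operator's execute()."""

    def __init__(self, aws_conn_id="aws_default"):
        self.aws_conn_id = aws_conn_id


class CreateThingOperator:
    """Stand-in for an ECS operator with wait_for_completion=True."""

    def __init__(self, aws_conn_id):
        self.aws_conn_id = aws_conn_id

    def execute(self, context):
        # ... resource is created here using self.aws_conn_id ...
        sensor = InnerWaitSensor()  # aws_conn_id is NOT forwarded
        return sensor.aws_conn_id   # waiting proceeds with "aws_default"
```

Because the inner object is constructed without forwarding `aws_conn_id`/`region`, the wait step silently uses `aws_default` even when the user configured a custom connection.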
   
   ### What you think should happen instead
   
   We should use boto3 waiters / hook methods instead of using other operators inside the `execute` methods.
   I am also not sure whether it is safe to propagate the context from one operator to another.
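A minimal sketch of the suggested direction — waiting on the operator's own client (e.g. obtained from its hook, `self.hook.conn`) instead of spawning another operator. The helper name and the plain polling loop are assumptions for illustration; the real fix would likely use a boto3/custom waiter:

```python
import time


def wait_for_cluster_active(ecs_client, cluster_name, poll_interval=15, max_attempts=40):
    """Poll describe_clusters until the cluster reports ACTIVE.

    Inside the operator, `ecs_client` would come from the operator's own hook,
    so the user-supplied `aws_conn_id` and region are reused for the wait.
    """
    for _ in range(max_attempts):
        clusters = ecs_client.describe_clusters(clusters=[cluster_name]).get("clusters", [])
        if clusters and clusters[0].get("status") == "ACTIVE":
            return clusters[0]
        time.sleep(poll_interval)
    raise TimeoutError(f"ECS cluster {cluster_name!r} did not reach ACTIVE state")
```

Since the same client performs both the API call and the wait, there is no second operator to mis-resolve the connection.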
   
   ### How to reproduce
   
   This is a follow-up to this discussion: https://github.com/apache/airflow/discussions/29504#discussioncomment-4982216
   
   1. Create an AWS Connection with a name different from `aws_default`
   2. Make sure that the `aws_default` connection does not exist
   3. Make sure that the Airflow environment cannot obtain AWS credentials by any other means (i.e. exclude environment variables, the shared credentials file, IAM instance profiles, the ECS Task Execution Role, etc.)
   4. Try to execute one of the selected ECS operators with `wait_for_completion` set to `True`
      + EcsCreateClusterOperator
      + EcsDeleteClusterOperator
      + EcsDeregisterTaskDefinitionOperator
      + EcsRegisterTaskDefinitionOperator 
   
   
   ```python
   from airflow import DAG
   from airflow.utils.timezone import datetime
   from airflow.providers.amazon.aws.operators.ecs import EcsCreateClusterOperator
   
   
   CUSTOM_AWS_CONN_ID = "aws-custom"
   REGION_NAME = "eu-west-1"
   
   assert CUSTOM_AWS_CONN_ID != "aws_default", "CUSTOM_AWS_CONN_ID should not be defined as 'aws_default'"
   assert CUSTOM_AWS_CONN_ID
   
   
   with DAG(
       "discussion_29504",
       start_date=datetime(2023, 2, 14),
       schedule_interval=None,
       catchup=False,
       tags=["amazon", "ecs", "discussion-29504"]
   ) as dag:
       EcsCreateClusterOperator(
           task_id="whooooops",
           aws_conn_id=CUSTOM_AWS_CONN_ID,
           region=REGION_NAME,
           cluster_name="discussion-29504",
           wait_for_completion=True,
       )
   ```
   
   ### Anything else
   
   This happens every time all of the following hold:
   - `wait_for_completion` is set to `True` for one of the ECS operators listed above
   - the `aws_default` connection does not exist or does not have permission for ECS operations
   - the fallback to the `boto3` default credential strategy cannot obtain valid credentials with access to ECS operations
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   