matejpavlovic-maistra opened a new issue #19426:
URL: https://github.com/apache/airflow/issues/19426


   ### Apache Airflow version
   
   2.2.1 (latest released)
   
   ### Operating System
   
   Debian GNU/Linux 10 (buster)
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==2.3.0
   apache-airflow-providers-celery==2.1.0
   apache-airflow-providers-cncf-kubernetes==2.0.3
   apache-airflow-providers-docker==2.2.0
   apache-airflow-providers-elasticsearch==2.0.3
   apache-airflow-providers-ftp==2.0.1
   apache-airflow-providers-google==6.0.0
   apache-airflow-providers-grpc==2.0.1
   apache-airflow-providers-hashicorp==2.1.1
   apache-airflow-providers-http==2.0.1
   apache-airflow-providers-imap==2.0.1
   apache-airflow-providers-microsoft-azure==3.2.0
   apache-airflow-providers-mysql==2.1.1
   apache-airflow-providers-odbc==2.0.1
   apache-airflow-providers-postgres==2.3.0
   apache-airflow-providers-redis==2.0.1
   apache-airflow-providers-sendgrid==2.0.1
   apache-airflow-providers-sftp==2.1.1
   apache-airflow-providers-slack==4.1.0
   apache-airflow-providers-sqlite==2.0.1
   apache-airflow-providers-ssh==2.2.0
   
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   The ECSOperator is not fetching CloudWatch logs to display in the Airflow UI.
   
   Definition:
   ```python
   hello_world = ECSOperator(
       task_id="hello_world",
       dag=dag,
       cluster='fargate-test',
       launch_type='FARGATE',
       task_definition='fargate-test',
       network_configuration={
           'awsvpcConfiguration': {
               'subnets': [
                   .....
               ],
               'assignPublicIp': 'ENABLED'
           }
       },
       overrides={"containerOverrides": [{'name': 'fargate-test', 'command': [comand]}]},
       awslogs_group='/ecs/fargate-test',
       awslogs_stream_prefix='/ecs/fargate-test',
       awslogs_region='eu-central-1',
       reattach=True,
   )
   ```
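   
   For context, a minimal sketch (purely my assumption about how the log fetcher assembles the stream name, not verified against the provider source) comparing the stream name that would follow from the configured `awslogs_stream_prefix` with the stream name that works for me via boto3 further below:
   
   ```python
   # Sketch only - assumed stream-name construction, hypothetical task id.
   awslogs_stream_prefix = '/ecs/fargate-test'  # as configured above (note the leading slash)
   ecs_task_id = '<id>'                         # placeholder for the real ECS task id
   
   # If the fetcher builds the name as "<prefix>/<task-id>" (assumption), it would look for:
   assumed_fetcher_stream = f"{awslogs_stream_prefix}/{ecs_task_id}"   # '/ecs/fargate-test/<id>'
   
   # The stream that actually exists and works with get_log_events (see below):
   working_stream = f"ecs/fargate-test/{ecs_task_id}"                  # 'ecs/fargate-test/<id>'
   
   print(assumed_fetcher_stream == working_stream)  # False - the names differ by the leading slash
   ```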
   
   Airflow UI logs look like:
   
   ```
   [2021-11-05, 14:50:00 CET] {ecs.py:313} INFO - Starting ECS Task Log Fetcher
   [2021-11-05, 14:50:00 CET] {credentials.py:1225} INFO - Found credentials in shared credentials file: ~/.aws/credentials
   [2021-11-05, 14:51:32 CET] {ecs.py:450} INFO - ECS Task stopped, check status: {...........}
   [2021-11-05, 14:51:32 CET] {ecs.py:328} INFO - ECS Task has been successfully executed
   [2021-11-05, 14:51:32 CET] {taskinstance.py:1280} INFO - Marking task as SUCCESS. dag_id=ecs_fargate_dag, task_id=hello_world, execution_date=20211105T134958, start_date=20211105T134959, end_date=20211105T135132
   [2021-11-05, 14:51:32 CET] {local_task_job.py:154} INFO - Task exited with return code 0
   [2021-11-05, 14:51:32 CET] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
   ```
   
   
   ### What you expected to happen
   
   In my CloudWatch log group there are additional logs which are not shown here (bug).
   When I fetch the logs with:
   ```python
   import boto3
   
   session = boto3.session.Session(region_name='eu-central-1')
   cli = session.client('logs')
   cli.get_log_events(logGroupName='/ecs/fargate-test', logStreamName='ecs/fargate-test/<id>')
   ```
   I get the full logs.
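   
   For completeness, a sketch of how I can list the stream names that actually exist in the group (standard boto3 `describe_log_streams`, same region and log group as above), to compare against whatever stream the operator tries to read:
   
   ```python
   import boto3
   
   # List the most recently active streams in the log group so their names can be
   # compared with the stream name the operator's log fetcher looks for.
   session = boto3.session.Session(region_name='eu-central-1')
   logs = session.client('logs')
   
   resp = logs.describe_log_streams(
       logGroupName='/ecs/fargate-test',
       orderBy='LastEventTime',
       descending=True,
       limit=5,
   )
   for stream in resp['logStreams']:
       print(stream['logStreamName'])  # e.g. 'ecs/fargate-test/<id>'
   ```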
   
   
   ### How to reproduce
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

