MrManicotti opened a new issue #12761:
URL: https://github.com/apache/airflow/issues/12761


   **Apache Airflow version**:
   2.0.0b3
   
   **What happened**:
   When Airflow is run with the CeleryExecutor and the new CloudWatch integration (from #7437), log groups and log streams are created, but no log events are pushed to AWS. The webserver is able to read the log stream as expected. There are no errors, just empty logs in CloudWatch:
   `*** Reading remote log from Cloudwatch log_group: airflow-task log_stream: tutorial_taskflow_api_etl_dag/extract/2020-12-02T16_37_07.011589+00_00/1.log.`
     
   Only after switching to the SequentialExecutor (the default) are the logs written as expected:
   ```
   *** Reading remote log from Cloudwatch log_group: airflow-task log_stream: 
tutorial_taskflow_api_etl_dag/extract/2020-12-02T18_04_50.259906+00_00/1.log.
   Task exited with return code 0
   test
   Exporting the following env vars:
   AIRFLOW_CTX_DAG_OWNER=airflow
   AIRFLOW_CTX_DAG_ID=tutorial_taskflow_api_etl_dag
   AIRFLOW_CTX_TASK_ID=extract
   AIRFLOW_CTX_EXECUTION_DATE=2020-12-02T18:04:50.259906+00:00
   AIRFLOW_CTX_DAG_RUN_ID=manual__2020-12-02T18:04:50.259906+00:00
   Running <TaskInstance: tutorial_taskflow_api_etl_dag.extract 
2020-12-02T18:04:50.259906+00:00 [running]> on host 845928f08686
   Started process 152 to run task
   Executing <Task(_PythonDecoratedOperator): extract> on 
2020-12-02T18:04:50.259906+00:00
   
   
--------------------------------------------------------------------------------
   Starting attempt 1 of 1
   
   
--------------------------------------------------------------------------------
   Dependencies all met for <TaskInstance: 
tutorial_taskflow_api_etl_dag.extract 2020-12-02T18:04:50.259906+00:00 [queued]>
   Dependencies all met for <TaskInstance: 
tutorial_taskflow_api_etl_dag.extract 2020-12-02T18:04:50.259906+00:00 [queued]>
   
   ```  
     
   Remote logging is enabled for the scheduler, webserver, and workers. **Log groups and log streams are being created as expected, but log events are not**. There are no errors in the workers, and the AWS IAM policy permissions are all-inclusive (they also definitely work with the SequentialExecutor).
   
   **What you expected to happen**:
   I expected Celery worker logs to be displayed in the webserver when run with CloudWatch remote logging.
   
   **How to reproduce it**:
   Run Airflow 2.0.0b3 in CeleryExecutor mode with a webserver, two workers, and one scheduler, with CloudWatch remote logging enabled.
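   For reference, CloudWatch remote logging in Airflow 2.0 is enabled via the `[logging]` section of `airflow.cfg`. The fragment below is a sketch of the kind of configuration in use; the connection ID, region, account ID, and log group ARN are illustrative assumptions, not values copied from the reporter's environment:

   ```ini
   [logging]
   # Enable shipping task logs to a remote store
   remote_logging = True
   # Airflow connection holding AWS credentials (assumed name)
   remote_log_conn_id = aws_default
   # cloudwatch:// scheme followed by the log group ARN (ARN values are placeholders)
   remote_base_log_folder = cloudwatch://arn:aws:logs:us-east-1:123456789012:log-group:airflow-task
   ```

   The same configuration must be present on the scheduler, webserver, and every Celery worker, since the worker process is the one that writes log events.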
   

