scrottty opened a new issue, #25145:
URL: https://github.com/apache/airflow/issues/25145

   ### Apache Airflow version
   
   2.2.3
   
   ### What happened
   
   When running code from an imported module, e.g. `boto3`, the logs are not printed to the Airflow task log. When running on my local machine, the logs are printed to the console after setting `logging.basicConfig(level=logging.DEBUG)`. Setting the environment variable `AIRFLOW__LOGGING__LOGGING_LEVEL` didn't help either.
   
   Here is a copy of the DAG code used:
   ```python
   import datetime
   
   from airflow import DAG
   from airflow.operators.python import PythonOperator
   
   import logging
   
   import boto3
   
   def test_log():
     logger = logging.getLogger('test_logger')
   
     logger.debug('debug')
     logger.info('info')
     logger.warning('warn')  # warn() is a deprecated alias for warning()
     logger.error('error')
     logger.critical('critical')
   
     # Try to enable DEBUG output for every logger (has no visible effect)
     logging.basicConfig(level=logging.DEBUG)
   
     # Creating the client makes boto3/botocore emit their own DEBUG logs,
     # but none of them show up in the Airflow task log
     client = boto3.client('s3')
   
   with DAG(
     'test_log',
     description='test log',
     schedule_interval=None,
     start_date=datetime.datetime(2022, 7, 18),
     catchup=False,
     tags=['logs']
   ) as dag:
     
     task = PythonOperator(
       task_id='test_log_task',
       python_callable=test_log,
       dag=dag
     )
   ```
   And here is the Airflow log file:
   ```
   *** Reading remote log from s3://airflow-logs-dev-ollie/v1/test_log/test_log_task/2022-07-18T21:54:30.228267+00:00/6.log.
   [2022-07-18, 22:56:59 UTC] {taskinstance.py:1035} INFO - Dependencies all met for <TaskInstance: test_log.test_log_task manual__2022-07-18T21:54:30.228267+00:00 [queued]>
   [2022-07-18, 22:56:59 UTC] {taskinstance.py:1035} INFO - Dependencies all met for <TaskInstance: test_log.test_log_task manual__2022-07-18T21:54:30.228267+00:00 [queued]>
   [2022-07-18, 22:56:59 UTC] {taskinstance.py:1241} INFO -
   --------------------------------------------------------------------------------
   [2022-07-18, 22:56:59 UTC] {taskinstance.py:1242} INFO - Starting attempt 6 of 6
   [2022-07-18, 22:56:59 UTC] {taskinstance.py:1243} INFO -
   --------------------------------------------------------------------------------
   [2022-07-18, 22:56:59 UTC] {taskinstance.py:1262} INFO - Executing <Task(PythonOperator): test_log_task> on 2022-07-18 21:54:30.228267+00:00
   [2022-07-18, 22:56:59 UTC] {standard_task_runner.py:52} INFO - Started process 13 to run task
   [2022-07-18, 22:56:59 UTC] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'test_log', 'test_log_task', 'manual__2022-07-18T21:54:30.228267+00:00', '--job-id', '53291', '--raw', '--subdir', 'DAGS_FOLDER/log_test.py', '--cfg-path', '/tmp/tmpgninlo77', '--error-file', '/tmp/tmpfjgy5csk']
   [2022-07-18, 22:56:59 UTC] {standard_task_runner.py:77} INFO - Job 53291: Subtask test_log_task
   [2022-07-18, 22:56:59 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: test_log.test_log_task manual__2022-07-18T21:54:30.228267+00:00 [running]> on host testlogtestlogtask.482126fd3d2f40509477941f31efd31b
   [2022-07-18, 22:56:59 UTC] {configuration.py:668} WARNING - Ignoring unknown env var 'AIRFLOW__KUBERNETES_SECRETS__aws_creds'
   [2022-07-18, 22:56:59 UTC] {configuration.py:668} WARNING - Ignoring unknown env var 'AIRFLOW__KUBERNETES_SECRETS__aws_config'
   [2022-07-18, 22:56:59 UTC] {taskinstance.py:1427} INFO - Exporting the following env vars:
   AIRFLOW_CTX_DAG_OWNER=airflow
   AIRFLOW_CTX_DAG_ID=test_log
   AIRFLOW_CTX_TASK_ID=test_log_task
   AIRFLOW_CTX_EXECUTION_DATE=2022-07-18T21:54:30.228267+00:00
   AIRFLOW_CTX_DAG_RUN_ID=manual__2022-07-18T21:54:30.228267+00:00
   [2022-07-18, 22:56:59 UTC] {log_test.py:14} INFO - info
   [2022-07-18, 22:56:59 UTC] {log_test.py:15} WARNING - warn
   [2022-07-18, 22:56:59 UTC] {log_test.py:16} ERROR - error
   [2022-07-18, 22:56:59 UTC] {log_test.py:17} CRITICAL - critical
   [2022-07-18, 22:56:59 UTC] {python.py:152} INFO - Done. Returned value was: None
   [2022-07-18, 22:56:59 UTC] {taskinstance.py:1270} INFO - Marking task as SUCCESS. dag_id=test_log, task_id=test_log_task, execution_date=20220718T215430, start_date=20220718T225659, end_date=20220718T225659
   [2022-07-18, 22:56:59 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
   ```
   
   The log statements from within the DAG function itself show up in the task log, but the logs emitted by boto3 never appear at all.
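   
   I suspect part of this is stdlib behaviour rather than anything Airflow-specific: `logging.basicConfig()` silently does nothing when the root logger already has handlers, and Airflow attaches its handlers before the task callable runs. A minimal sketch outside Airflow showing what I mean (my assumption about the mechanism, not a confirmed diagnosis):
   
   ```python
   import logging
   
   # Simulate a framework (such as Airflow) configuring the root logger first
   logging.basicConfig(level=logging.INFO)
   
   # A later basicConfig() call is a silent no-op: root already has handlers
   logging.basicConfig(level=logging.DEBUG)
   print(logging.getLogger().getEffectiveLevel())  # still 20 (INFO)
   
   # Python 3.8+ only: force=True removes existing handlers and reconfigures
   logging.basicConfig(level=logging.DEBUG, force=True)
   print(logging.getLogger().getEffectiveLevel())  # now 10 (DEBUG)
   ```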
   
   ### What you think should happen instead
   
   The log statements from the imported module should be printed to the Airflow log at the level set via `logging.basicConfig(level=logging.DEBUG)` or via
   ```
   logger = logging.getLogger('logger_name')
   logger.setLevel(level=logging.DEBUG)
   ```
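   
   As a workaround sketch (untested on my side; the `test_log_verbose` callable below is just an illustrative variant of the task above): since records propagate from a library's loggers up to the root handlers Airflow installs, raising the level on boto3's own logger hierarchy should surface them even though `basicConfig()` is a no-op:
   
   ```python
   import logging
   
   import boto3
   
   def test_log_verbose():
     # Raise the level on boto3's own loggers; their records propagate up
     # to whatever root handlers write the Airflow task log
     for name in ('boto3', 'botocore'):
       logging.getLogger(name).setLevel(logging.DEBUG)
   
     # Alternatively, boto3 ships a helper that attaches a StreamHandler:
     # boto3.set_stream_logger(name='botocore', level=logging.DEBUG)
   
     client = boto3.client('s3')  # botocore DEBUG records should now appear
   ```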
   
   ### How to reproduce
   
   The code above should reproduce the issue. Let me know if you need more!
   
   ### Operating System
   
   Debian GNU/Linux 10 (buster)
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-amazon==2.4.0
   apache-airflow-providers-celery==2.1.0
   apache-airflow-providers-cncf-kubernetes==3.0.0
   apache-airflow-providers-docker==2.3.0
   apache-airflow-providers-elasticsearch==2.1.0
   apache-airflow-providers-ftp==2.0.1
   apache-airflow-providers-google==6.1.0
   apache-airflow-providers-grpc==2.0.1
   apache-airflow-providers-hashicorp==2.1.1
   apache-airflow-providers-http==2.0.1
   apache-airflow-providers-imap==2.0.1
   apache-airflow-providers-microsoft-azure==3.3.0
   apache-airflow-providers-mysql==2.1.1
   apache-airflow-providers-odbc==2.0.1
   apache-airflow-providers-postgres==2.3.0
   apache-airflow-providers-redis==2.0.1
   apache-airflow-providers-sendgrid==2.0.1
   apache-airflow-providers-sftp==2.2.0
   apache-airflow-providers-slack==4.1.0
   apache-airflow-providers-sqlite==2.0.1
   apache-airflow-providers-ssh==2.3.0
   ```
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   Deployed on EKS v1.22
   
   ### Anything else
   
   Occurs every time.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

