dineshmarimu2 opened a new issue, #51360:
URL: https://github.com/apache/airflow/issues/51360

   ### Apache Airflow version
   
   3.0.1
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   Task logs in the Airflow UI include an extra entry, added by Airflow, that 
states the source process responsible for emitting each log line. These 
additional entries clutter the log viewport and make the logs harder to debug. 
Example screenshots showing the additional messages are pasted below:
   
   
![Image](https://github.com/user-attachments/assets/c83d69f7-8130-484b-83b5-6f1f47e44522)
   
   
![Image](https://github.com/user-attachments/assets/0df7a706-4fed-49fa-a878-f9639d009d6b)
   
   
![Image](https://github.com/user-attachments/assets/36a68c50-6252-4fa4-8845-a378f75884ee)
   
   
   ### What you think should happen instead?
   
   I would like to be able to control, via Airflow configuration or environment 
variables, whether this additional source-process information is retained in 
the task logs.
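
In the meantime, one conceivable workaround (a sketch only — `SourceAnnotationFilter` and the `"source="` marker string are my assumptions for illustration, not existing Airflow APIs or the actual annotation format) would be a standard-library `logging.Filter` attached to a handler to drop such annotation lines:

```python
import logging

class SourceAnnotationFilter(logging.Filter):
    """Drop log records that are only source-process annotations.

    The marker below is an assumed placeholder; it would need to be
    adjusted to match the actual text Airflow adds to each log line.
    """

    MARKER = "source="  # hypothetical marker string

    def filter(self, record: logging.LogRecord) -> bool:
        # Returning False suppresses the record.
        return self.MARKER not in record.getMessage()

logger = logging.getLogger("demo")
handler = logging.StreamHandler()
handler.addFilter(SourceAnnotationFilter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("IP Address: 1.2.3.4")        # kept: no marker present
logger.info("source=task runner exited")  # dropped by the filter
```

Whether such a filter could be wired into Airflow's task-log handlers cleanly is exactly why a first-class configuration option would be preferable.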
   
   ### How to reproduce
   
   ```python
   from airflow.decorators import dag, task
   from datetime import datetime, timedelta
   import requests
   import logging

   # Default parameters applied to every task in the DAG.
   default_args = {
       'owner': 'airflow',
       'start_date': datetime(2023, 1, 1),
       'retries': 0,
       'retry_delay': timedelta(minutes=5),
   }

   # Instantiate the DAG using the @dag decorator.
   @dag(default_args=default_args, catchup=False)
   def get_ip_dag():

       # Task that fetches the public IP address and logs it.
       @task
       def get_ip():
           response = requests.get('https://api64.ipify.org?format=json').json()
           ip_address = response["ip"]
           logging.info(f"IP Address: {ip_address}")
           return ip_address

       ip_task = get_ip()

   # Instantiate the DAG.
   dag_instance = get_ip_dag()

   # Allow running the DAG from the command line.
   if __name__ == "__main__":
       dag_instance.cli()
   ```
   
   
   ### Operating System
   
   Ubuntu Linux
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   My deployment uses the official Helm chart version 1.16.0 with Airflow 3.0.1 
and Python 3.12, on Azure Kubernetes Service with Kubernetes version 1.30.11.
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

