Abhimanuchauhan01 opened a new issue, #45905:
URL: https://github.com/apache/airflow/issues/45905

   ### Apache Airflow version
   
   2.10.4
   
   ### If "Other Airflow 2 version" selected, which one?
   
   Airflow Bug Report
   
   ### What happened?
   
   While running a DAG with a task configured with retries=3, the task 
did not stop retrying after three failed attempts. Instead, it kept retrying 
indefinitely, ignoring the configured limit.
   
   Context in which the problem occurred:
   I encountered this issue after upgrading Apache Airflow from version 2.7.2 
to 2.7.3. The DAG is set up to use the CeleryExecutor, and the task in question 
is a PythonOperator that calls a simple Python function. The problem is 
reproducible every time the task fails.
   
   ### What you think should happen instead?
   
   The task should fail permanently after exhausting the configured 
retries=3 attempts. Airflow should respect the retries parameter and 
not continue retrying indefinitely.
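
   As a minimal sketch of the semantics expected here (illustrative only, not Airflow's actual scheduler code): retries=3 should mean one initial attempt plus at most three retries, i.e. at most four executions in total before the failure becomes permanent.

```python
def run_with_retries(task, retries=3):
    """Sketch of the expected retry semantics: 1 initial attempt + `retries` retries."""
    for attempt in range(1, retries + 2):  # try numbers 1 .. retries + 1
        try:
            return task()
        except Exception:
            if attempt == retries + 1:
                raise  # retries exhausted -> the failure should be permanent
            # the real scheduler would wait `retry_delay` before the next attempt


attempts = []

def fail_task():
    attempts.append(1)
    raise ValueError("This task is designed to fail.")

try:
    run_with_retries(fail_task, retries=3)
except ValueError:
    pass

print(len(attempts))  # the task ran exactly 4 times, then failed permanently
```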
   
   What went wrong?
   It appears that the retries limit is either being ignored or 
overridden. This may be a regression introduced in the upgrade from 
version 2.7.2 to 2.7.3, or the CeleryExecutor may not be correctly 
interpreting the retries setting.
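
   For reference, the check the scheduler is expected to perform on each failure can be sketched as follows (a minimal illustration of the expected logic, not Airflow's actual source; the names try_number and max_tries mirror fields on Airflow's TaskInstance model):

```python
def is_eligible_to_retry(try_number: int, max_tries: int) -> bool:
    """Expected check after a failure: retry only while attempts remain.

    With retries=3 the task instance gets max_tries=3, so failures on
    attempts 1-3 schedule a retry and the failure on attempt 4 is final.
    If this comparison were skipped, or max_tries were reset on every
    failure, the task would retry forever, which matches the behavior
    reported above.
    """
    return try_number <= max_tries


for attempt in (1, 2, 3, 4):
    print(attempt, is_eligible_to_retry(attempt, max_tries=3))
# attempts 1-3 -> True (retry), attempt 4 -> False (permanent failure)
```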
   
   ### How to reproduce
   
   Steps to reproduce the problem:
   
   Environment setup:
   
   - Airflow version: 2.7.3
   - Executor: CeleryExecutor
   - Scheduler: default Airflow scheduler
   - Python version: 3.10
   - Database: PostgreSQL 13
   - Operating system: Ubuntu 22.04
   
   DAG configuration:
   Create a minimal DAG with the following structure:
   
   ```python
   from airflow import DAG
   from airflow.operators.python import PythonOperator
   from datetime import datetime
   
   def fail_task():
       raise ValueError("This task is designed to fail.")
   
   with DAG(
       dag_id='retry_issue_demo',
       start_date=datetime(2025, 1, 1),
       schedule_interval=None,
       catchup=False,
   ) as dag:
       failing_task = PythonOperator(
           task_id='failing_task',
           python_callable=fail_task,
           retries=3,  # the task should fail permanently after 3 retries
       )
   ```
   Execution:
   
   1. Deploy the DAG to your Airflow environment.
   2. Trigger the DAG manually via the Airflow UI or CLI.
   
   Observation:
   
   1. Monitor the task failing_task in the Airflow UI.
   2. Note whether the retry behavior respects the configured retries=3.
   
   Expected behavior:
   The task should fail permanently after three retry attempts.
   
   Actual behavior:
   The task continues to retry indefinitely, disregarding the retries parameter.
   
   ### Operating System
   
   Ubuntu 22.04 LTS
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-google==7.0.0
   apache-airflow-providers-amazon==3.0.0
   apache-airflow-providers-slack==2.0.0
   
   
   ### Deployment
   
   Other
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [x] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
