GitHub user aayostem added a comment to the discussion: DAG runs perfectly when 
I trigger it manually, but when it runs on its scheduled interval, some tasks 
fail with AirflowTaskTimeout errors

This could be caused by:
 - Worker resource exhaustion: with the CeleryExecutor or KubernetesExecutor, workers 
can get overloaded when many DAGs fire at the same scheduled time.
 - Scheduler overload: the scheduler can't keep up with parsing and scheduling all DAGs.
```
# Increase worker resources or adjust concurrency
[celery]
worker_concurrency = 8  # Reduce if tasks are memory-intensive

# Or use pools to cap how many tasks can hold a given resource at once
# In the Airflow UI: Admin -> Pools, create pools per resource type
# (see the task-level sketch after this block)
```
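
A minimal sketch of wiring a task into a pool, assuming Airflow 2.x and a pool named `heavy_io` already created under Admin -> Pools (the pool name, DAG id, and slot counts here are illustrative):
```
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Assumes a pool "heavy_io" exists (Admin -> Pools). Tasks assigned to it
# queue once all of the pool's slots are taken, instead of piling onto workers.
with DAG(
    dag_id="pool_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo extracting",
        pool="heavy_io",   # draw a slot from the shared pool
        pool_slots=1,      # heavier tasks can claim more than one slot
    )
```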

If it's scheduler overload:
```
# Increase scheduler throughput
[scheduler]
parsing_processes = 2        # Airflow 2.0+ name (was max_threads); raise from the default 2 if you have spare CPU cores
scheduler_heartbeat_sec = 3  # Faster heartbeat than the default 5

# Give DAG parsing more headroom
[core]
dagbag_import_timeout = 60              # Raise from the default 30 if DAG files are slow to import
min_serialized_dag_fetch_interval = 15  # Fetch serialized DAGs less often (default 10)
```
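
Independent of the infrastructure fixes, it can help to make the tasks themselves tolerant of slow scheduled windows, since `execution_timeout` is the per-task knob whose breach raises AirflowTaskTimeout. A minimal sketch, assuming Airflow 2.x; the timeout and retry values are illustrative:
```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform():
    ...  # placeholder for the real work


with DAG(
    dag_id="timeout_example",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    transform_task = PythonOperator(
        task_id="transform",
        python_callable=transform,
        execution_timeout=timedelta(minutes=30),  # exceeding this raises AirflowTaskTimeout
        retries=2,                                # retry transient slowdowns...
        retry_delay=timedelta(minutes=5),         # ...after a short back-off
    )
```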

GitHub link: 
https://github.com/apache/airflow/discussions/58363#discussioncomment-14980597
