wangxiaojun1990 opened a new issue, #63551:
URL: https://github.com/apache/airflow/issues/63551

   ### Apache Airflow version
   3.1.8
   
   ### What happened
   When deploying Airflow 3.1.8 with a fresh database (no existing 
`default_pool` in the `pool` table), tasks are stuck in `scheduled` state and 
never transition to `queued` or `running`. The scheduler logs show "All pools 
are full!" even though no pools exist in the database.
   
   ### Root cause analysis
   The scheduler's `slots_stats()` method queries the `pool` table and returns 
an empty dictionary when no pools exist:
   
   ```python
   # airflow/jobs/scheduler_job_runner.py:363-371
   pools = Pool.slots_stats(lock_rows=True, session=session)
   pool_slots_free = sum(max(0, pool["open"]) for pool in pools.values())
   
   if pool_slots_free == 0:
       self.log.debug("All pools are full!")
       return []  # No tasks are scheduled
   ```
   
   The configuration `default_pool_task_slot_count = 128` exists, but the 
`default_pool` record is not automatically created in the database during 
`airflow db migrate`.
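   The early return can be reproduced in isolation: with an empty `pools` dict, the generator sums to zero, so the scheduler bails out before examining any task instances. A minimal standalone sketch (no Airflow imports needed; `pools` mirrors the name-to-stats mapping returned by `Pool.slots_stats()`, and only the `"open"` key matters here):

   ```python
   def schedulable_slots(pools: dict) -> int:
       """Mirror of the scheduler's free-slot computation.

       pools maps pool name -> stats dict; negative "open" values
       (over-subscribed pools) are clamped to zero, just like the
       max(0, ...) in the scheduler snippet above.
       """
       return sum(max(0, pool["open"]) for pool in pools.values())

   # Fresh database: no rows in the `pool` table -> empty dict -> 0 free slots,
   # which triggers the misleading "All pools are full!" debug message.
   print(schedulable_slots({}))  # 0

   # Healthy database: default_pool present with free slots.
   print(schedulable_slots({"default_pool": {"open": 128}}))  # 128
   ```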
   
   ### How to reproduce
   1. Deploy Airflow 3.1.8 with a fresh database
   2. Create a simple DAG with a PythonOperator task
   3. Trigger the DAG from UI or CLI
   4. Task stays in `scheduled` state indefinitely

   ### Environment
   - **Airflow version**: 3.1.8
   - **Helm chart version**: 1.19.0
   - **Executor**: CeleryExecutor
   - **Database**: MySQL 8.0

   ### Verification commands
   ```bash
   # Check pools - returns empty
   airflow pools list
   # Output: No data found
   
   # Check default_pool_task_slot_count - returns 128
   airflow config get-value core default_pool_task_slot_count
   # Output: 128
   
   # Check get_default_pool() - returns None
   python -c "from airflow.models.pool import Pool; print(Pool.get_default_pool())"
   # Output: None
   ```
   
   ### Expected behavior
   Either:
   1. `default_pool` should be automatically created during `airflow db migrate`, OR
   2. Scheduler should handle a missing `default_pool` gracefully by falling back to the configured `default_pool_task_slot_count`
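   Option 2 could look roughly like the following sketch, which synthesizes a `default_pool` entry from the configured slot count when the table has none. The stats-dict keys and the helper itself are hypothetical illustrations, not the actual scheduler API:

   ```python
   DEFAULT_POOL_NAME = "default_pool"

   def with_default_pool_fallback(pools: dict, configured_slots: int) -> dict:
       """Return pool stats, injecting a synthetic default_pool if absent.

       configured_slots would come from the existing
       [core] default_pool_task_slot_count setting.
       """
       if DEFAULT_POOL_NAME in pools:
           return pools  # real row exists; leave it alone
       fallback = {"total": configured_slots, "running": 0,
                   "queued": 0, "open": configured_slots}
       return {**pools, DEFAULT_POOL_NAME: fallback}

   # Empty table (the bug scenario): fallback keeps the scheduler moving.
   pools = with_default_pool_fallback({}, configured_slots=128)
   print(pools["default_pool"]["open"])  # 128
   ```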
   ### Workaround
   Manually create the default pool after database migration:
   ```bash
   airflow pools set default_pool 128 "Default pool"
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
