ricky-chaoju commented on PR #59918:
URL: https://github.com/apache/airflow/pull/59918#issuecomment-3758566966
> LGTM
>
> Can you run the command with test data to confirm the functionality?
```
rickychen@airflow-dev:~/Desktop/airflow/airflow$ airflow db migrate
2026-01-16T07:31:07.753754Z [info ] setup plugin
alembic.autogenerate.schemas [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:07.753981Z [info ] setup plugin
alembic.autogenerate.tables [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:07.754136Z [info ] setup plugin
alembic.autogenerate.types [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:07.754253Z [info ] setup plugin
alembic.autogenerate.constraints [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:07.754362Z [info ] setup plugin
alembic.autogenerate.defaults [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:07.754469Z [info ] setup plugin
alembic.autogenerate.comments [alembic.runtime.plugins] loc=plugins.py:37
DB: sqlite:////home/rickychen/airflow/airflow.db
Performing upgrade to the metadata database
sqlite:////home/rickychen/airflow/airflow.db
2026-01-16T07:31:08.888657Z [info ] Context impl SQLiteImpl.
[alembic.runtime.migration] loc=migration.py:210
2026-01-16T07:31:08.888864Z [info ] Will assume non-transactional DDL.
[alembic.runtime.migration] loc=migration.py:213
2026-01-16T07:31:08.891245Z [info ] Migrating the Airflow database
[airflow.utils.db] loc=db.py:1129
2026-01-16T07:31:08.896066Z [info ] Context impl SQLiteImpl.
[alembic.runtime.migration] loc=migration.py:210
2026-01-16T07:31:08.896213Z [info ] Will assume non-transactional DDL.
[alembic.runtime.migration] loc=migration.py:213
2026-01-16T07:31:08.923112Z [info ] Context impl SQLiteImpl.
[alembic.runtime.migration] loc=migration.py:210
2026-01-16T07:31:08.923245Z [info ] Will assume non-transactional DDL.
[alembic.runtime.migration] loc=migration.py:213
Database migrating done!
rickychen@airflow-dev:~/Desktop/airflow/airflow$ airflow db clean --dry-run --clean-before-timestamp "2026-01-01"
2026-01-16T07:31:17.493968Z [info ] setup plugin
alembic.autogenerate.schemas [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:17.494237Z [info ] setup plugin
alembic.autogenerate.tables [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:17.494393Z [info ] setup plugin
alembic.autogenerate.types [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:17.494528Z [info ] setup plugin
alembic.autogenerate.constraints [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:17.494689Z [info ] setup plugin
alembic.autogenerate.defaults [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:31:17.494804Z [info ] setup plugin
alembic.autogenerate.comments [alembic.runtime.plugins] loc=plugins.py:37
Performing dry run for db cleanup.
Data prior to 2026-01-01 00:00:00+00:00 would be purged from tables
['callback_request', '_xcom_archive', 'task_reschedule', 'task_instance_history',
'xcom', 'task_instance', 'deadline', 'dag_run', 'dag_version', 'trigger', 'dag',
'celery_tasksetmeta', 'log', 'celery_taskmeta', 'import_error', 'asset_event',
'sla_miss', 'job'] with the following config:
table                 | recency_column                   | keep_last | keep_last_filters           | keep_last_group_by
======================+==================================+===========+=============================+===================
callback_request      | callback_request.created_at      | False     | None                        | None
_xcom_archive         | _xcom_archive.timestamp          | False     | None                        | None
task_reschedule       | task_reschedule.start_date       | False     | None                        | None
task_instance_history | task_instance_history.start_date | False     | None                        | None
xcom                  | xcom.timestamp                   | False     | None                        | None
task_instance         | task_instance.start_date         | False     | None                        | None
deadline              | deadline.deadline_time           | False     | None                        | None
dag_run               | dag_run.start_date               | True      | ['run_type != :run_type_1'] | ['dag_id']
dag_version           | dag_version.created_at           | False     | None                        | None
trigger               | trigger.created_date             | False     | None                        | None
dag                   | dag.last_parsed_time             | False     | None                        | None
celery_tasksetmeta    | celery_tasksetmeta.date_done     | False     | None                        | None
log                   | log.dttm                         | False     | None                        | None
celery_taskmeta       | celery_taskmeta.date_done        | False     | None                        | None
import_error          | import_error.timestamp           | False     | None                        | None
asset_event           | asset_event.timestamp            | False     | None                        | None
sla_miss              | sla_miss.timestamp               | False     | None                        | None
job                   | job.latest_heartbeat             | False     | None                        | None
Performing dry run for table callback_request
Checking table callback_request
Found 0 rows meeting deletion criteria.
2026-01-16T07:31:18.745211Z [warning ] Table _xcom_archive not found.
Skipping. [airflow.utils.db_cleanup] loc=db_cleanup.py:552
Performing dry run for table task_reschedule
Checking table task_reschedule
Found 0 rows meeting deletion criteria.
Performing dry run for table task_instance_history
Checking table task_instance_history
Found 0 rows meeting deletion criteria.
Performing dry run for table xcom
Checking table xcom
Found 0 rows meeting deletion criteria.
Performing dry run for table task_instance
Checking table task_instance
Found 0 rows meeting deletion criteria.
Performing dry run for table deadline
Checking table deadline
Found 0 rows meeting deletion criteria.
Performing dry run for table dag_run
Checking table dag_run
Found 0 rows meeting deletion criteria.
Performing dry run for table dag_version
Checking table dag_version
Found 0 rows meeting deletion criteria.
Performing dry run for table trigger
Checking table trigger
Found 0 rows meeting deletion criteria.
Performing dry run for table dag
Checking table dag
Found 0 rows meeting deletion criteria.
2026-01-16T07:31:18.764759Z [warning ] Table celery_tasksetmeta not found.
Skipping. [airflow.utils.db_cleanup] loc=db_cleanup.py:552
Performing dry run for table log
Checking table log
Found 0 rows meeting deletion criteria.
2026-01-16T07:31:18.766781Z [warning ] Table celery_taskmeta not found.
Skipping. [airflow.utils.db_cleanup] loc=db_cleanup.py:552
Performing dry run for table import_error
Checking table import_error
Found 0 rows meeting deletion criteria.
Performing dry run for table asset_event
Checking table asset_event
Found 0 rows meeting deletion criteria.
2026-01-16T07:31:18.770724Z [warning ] Table sla_miss not found. Skipping.
[airflow.utils.db_cleanup] loc=db_cleanup.py:552
Performing dry run for table job
Checking table job
Found 0 rows meeting deletion criteria.
rickychen@airflow-dev:~/Desktop/airflow/airflow$ python3 <<'EOF'
from airflow.dag_processing.bundles.manager import DagBundlesManager
from airflow.utils.session import create_session
manager = DagBundlesManager()
with create_session() as session:
    manager.sync_bundles_to_db(session=session)
print("sync_bundles_to_db() success")
print("Bundles:", [b.name for b in manager.get_all_dag_bundles()])
EOF
2026-01-16T07:32:56.149137Z [info ] setup plugin
alembic.autogenerate.schemas [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:32:56.149383Z [info ] setup plugin
alembic.autogenerate.tables [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:32:56.149584Z [info ] setup plugin
alembic.autogenerate.types [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:32:56.149732Z [info ] setup plugin
alembic.autogenerate.constraints [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:32:56.149900Z [info ] setup plugin
alembic.autogenerate.defaults [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:32:56.150033Z [info ] setup plugin
alembic.autogenerate.comments [alembic.runtime.plugins] loc=plugins.py:37
2026-01-16T07:32:56.228499Z [info ] DAG bundles loaded: dags-folder,
example_dags [airflow.dag_processing.bundles.manager.DagBundlesManager]
loc=manager.py:179
sync_bundles_to_db() success
Bundles: ['dags-folder', 'example_dags']
rickychen@airflow-dev:~/Desktop/airflow/airflow$
```
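One row in the config table above is worth unpacking: `dag_run` has `keep_last=True` with `keep_last_group_by=['dag_id']`, meaning the cleanup purges runs older than the cutoff but always retains the most recent run per DAG (Airflow additionally applies the `keep_last_filters` shown, which this sketch ignores). A minimal self-contained sketch of that semantics, using sqlite3 with hypothetical test rows — not Airflow's actual implementation:

```python
import sqlite3

# Sketch (NOT Airflow's implementation) of the `keep_last` cleanup semantics
# shown for dag_run above: purge rows older than the cutoff, but always keep
# the most recent row per dag_id. Table contents are made-up test data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dag_run (id INTEGER PRIMARY KEY, dag_id TEXT, start_date TEXT)")
conn.executemany(
    "INSERT INTO dag_run VALUES (?, ?, ?)",
    [
        (1, "dag_a", "2025-11-01"),
        (2, "dag_a", "2025-12-15"),  # newest for dag_a -> kept despite being old
        (3, "dag_b", "2025-10-01"),
        (4, "dag_b", "2026-01-10"),  # newer than cutoff -> kept anyway
    ],
)

# SQLite's documented bare-column behavior with MAX() returns, per group,
# the id of the row holding the newest start_date.
conn.execute(
    """
    DELETE FROM dag_run
    WHERE start_date < :cutoff
      AND id NOT IN (
          SELECT id FROM (
              SELECT id, MAX(start_date) FROM dag_run GROUP BY dag_id
          )
      )
    """,
    {"cutoff": "2026-01-01"},
)
kept = [r[0] for r in conn.execute("SELECT id FROM dag_run ORDER BY id")]
print(kept)  # [2, 4] -- each DAG's latest run survives the purge
```

Rows without `keep_last` (e.g. `log`, `xcom`) would be deleted by the `start_date < :cutoff` condition alone, which is why their `keep_last_group_by` is `None` in the dry-run output.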
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]