HuanjieGuo opened a new issue, #46856: URL: https://github.com/apache/airflow/issues/46856
### Apache Airflow version

Other Airflow 2 version (please specify below)

### If "Other Airflow 2 version" selected, which one?

2.7.2

### What happened?

We have run a production Airflow cluster for two years, and it has accumulated a very large number of records in the **log** and **dag_run** tables. When I run `airflow db clean` on them, it fails because the deletion exceeds MySQL's transaction size limit:

`MySQLdb.OperationalError: (1197, "Multi-statement transaction required more than 'max_binlog_cache_size' bytes of storage; increase this mysqld variable and try again")`

The problem is that the `_airflow_deleted_XXXX` staging table is still kept in MySQL after the failure. We trigger `airflow db clean` every day, and each failed run generates a new `_airflow_deleted_XXXX` table. Eventually we received an email from our MySQL team telling us that the cluster's disk was out of space.

### What you think should happen instead?

When `airflow db clean` fails on a table, it should also drop the corresponding `_airflow_deleted_XXXX` table instead of leaving it behind.

### How to reproduce

Create a large table, set MySQL's `max_binlog_cache_size` low enough that the cleanup transaction exceeds it, then run `airflow db clean` on that table.

### Operating System

linux

### Versions of Apache Airflow Providers

_No response_

### Deployment

Official Apache Airflow Helm Chart

### Deployment details

_No response_

### Anything else?

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
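For anyone hitting this before a fix lands, the leftover staging tables can be found and dropped manually. Below is a hedged sketch (not Airflow code; the exact staging-table naming may differ between Airflow versions, and the `_airflow_deleted_` prefix is taken from the error report above) of a helper that picks the leftover tables out of a full table listing and builds `DROP TABLE` statements to review before running:

```python
import re

# `airflow db clean` stages rows in tables whose names start with
# "_airflow_deleted_" (prefix as reported in this issue; the full naming
# scheme is an assumption here). If the clean fails mid-transaction, these
# tables can be left behind and accumulate with each daily run.
LEFTOVER_RE = re.compile(r"^_airflow_deleted_")


def find_leftover_tables(table_names):
    """Return the names of leftover _airflow_deleted_* staging tables."""
    return [name for name in table_names if LEFTOVER_RE.match(name)]


def drop_statements(table_names):
    """Build DROP TABLE statements for the leftovers (review before executing!)."""
    return [
        f"DROP TABLE IF EXISTS `{name}`;"
        for name in find_leftover_tables(table_names)
    ]
```

Feeding it the output of `SHOW TABLES;` would yield the statements to reclaim the disk space; running them against the metadata database is left as a manual, reviewed step on purpose.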
