bluzy opened a new issue, #33187:
URL: https://github.com/apache/airflow/issues/33187

   ### Apache Airflow version
   
   2.6.3
   
   ### What happened
   
   I am testing DAGs on a local Airflow installation with the default SQLite database.
   
   After upgrading Airflow from 2.2.4 to 2.6.3, I encountered a `database is locked` error when running the `airflow db init` or `airflow standalone` command.
   
   I inspected the code and found that the session is not committed after the `serialized_dag` table is cleared:
   
   
https://github.com/apache/airflow/blob/5a0494f83e8ad0e5cbf0d3dcad3022a3ea89d789/airflow/utils/db.py#L886-L893
   
   After I added a `session.commit()` right after the `serialized_dag` table is cleared, it works fine.
   Alternatively, manually deleting the rows in the `serialized_dag` table fixes the issue.
   
   The problem occurs with our production DAG code but is not reproducible with simple example DAGs.
   As far as I can tell, the only difference is the size of the `data` field, though I don't know why that would cause a database lock.
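
   The locking behaviour itself can be reproduced with the standard library alone: an uncommitted `DELETE` on one SQLite connection holds the database's write lock, so any other connection that tries to write fails with `database is locked` until the first connection commits. A minimal sketch (the `serialized_dag` table here is only an illustrative stand-in, not Airflow's actual schema):

   ```python
   import os
   import sqlite3
   import tempfile

   # Minimal reproduction of SQLite's "database is locked" error: an
   # uncommitted DELETE on one connection holds the write lock, blocking
   # writes from any other connection until the first one commits.
   path = os.path.join(tempfile.mkdtemp(), "demo.db")

   setup = sqlite3.connect(path)
   setup.execute("CREATE TABLE serialized_dag (id INTEGER PRIMARY KEY, data TEXT)")
   setup.execute("INSERT INTO serialized_dag (data) VALUES ('{}')")
   setup.commit()
   setup.close()

   writer = sqlite3.connect(path)
   writer.execute("DELETE FROM serialized_dag")  # write lock acquired, not committed

   other = sqlite3.connect(path, timeout=0.1)
   try:
       other.execute("INSERT INTO serialized_dag (data) VALUES ('x')")
       locked = False
   except sqlite3.OperationalError as exc:
       locked = "locked" in str(exc)

   writer.commit()  # analogous to the session.commit() workaround above
   other.execute("INSERT INTO serialized_dag (data) VALUES ('x')")  # now succeeds
   other.commit()

   print("second writer was blocked:", locked)
   ```

   This mirrors the reported fix: committing immediately after the `DELETE` releases the write lock before any subsequent statement runs on another connection.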
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   Only reproducible with a SQLite database.
   I don't know the exact conditions, but I suspect large DAGs trigger the problem.
   
   ### Operating System
   
   macOS (Apple Silicon)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
