potiuk opened a new issue, #60971: URL: https://github.com/apache/airflow/issues/60971
### Description

Currently `airflow standalone` multiplexes logs on stdout, and while that is useful for non-interactive usage, when you run `airflow standalone` as an interactive user (for example when you want to do some Dag development related tasks, or even when you develop Airflow itself), it's not really helpful to see those logs multiplexed. Recently in `breeze` we switched to `mprocs` as the default terminal multiplexer, and after the discussion in https://lists.apache.org/thread/lj1lj2typ8mwj75pmjt3gs1o6k9p7dmf the conclusion is that `mprocs` would be a great terminal/log multiplexer for an `airflow standalone --interactive` mode (a new mode that we would like to add; a rough sketch of how it could work is included at the end of this issue). It has some nice features:

* intuitive TUI interface
* ability to control, restart and generally manage each of the sub-processes
* ability to see logs separately for each process
* intuitive keyboard and mouse navigation
* nice copy&paste support for issue reporting and diagnostics

### Use case/motivation

An interactive `airflow standalone` would be nice for anyone using `airflow standalone` to test and debug Dags. It is also good for testing and debugging Airflow itself - for local virtualenv development it could be a convenient local equivalent of `start-airflow` without the need for Docker and Docker Compose (with limitations, of course - only working with SQLite and only supporting the local executor).

### Related issues

* https://github.com/apache/airflow/pull/58702
* https://github.com/apache/airflow/pull/58718
* https://github.com/apache/airflow/pull/60844

### Are you willing to submit a PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
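
---

For illustration only, a minimal sketch of how the `--interactive` mode could delegate the multiplexing to `mprocs` instead of interleaving logs on stdout. The `--interactive` flag, the component command list, and the fallback behaviour are assumptions of this proposal, not existing Airflow behaviour; the real implementation would reuse the component list that `airflow standalone` already manages.

```python
"""Illustrative sketch only: run each standalone component in its own mprocs pane."""
import shutil
import subprocess
import sys

# Assumed set of component commands; the actual list would come from the
# existing `airflow standalone` implementation.
COMPONENT_COMMANDS = [
    "airflow api-server",
    "airflow scheduler",
    "airflow dag-processor",
    "airflow triggerer",
]


def run_interactive_standalone() -> int:
    """Launch each component in a separate mprocs pane so its logs stay separated."""
    if shutil.which("mprocs") is None:
        # Assumed fallback: behave like the current non-interactive standalone.
        print("mprocs is not installed; falling back to `airflow standalone`.", file=sys.stderr)
        return subprocess.call(["airflow", "standalone"])
    # mprocs runs each positional argument as a separate managed process,
    # giving one pane (own log view, start/stop controls) per component.
    return subprocess.call(["mprocs", *COMPONENT_COMMANDS])


if __name__ == "__main__":
    raise SystemExit(run_interactive_standalone())
```

Per-pane names, autostart behaviour, etc. could alternatively be driven by a generated `mprocs` YAML config rather than positional commands.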
