GitHub user trikker edited a comment on the discussion: Is there a function that allows the celery worker provided by airflow to take DAG from the Rabbitmq Quorum queue?
@jojaeng2 Hi jojaeng2, I am testing the HA of all Airflow components and may need your help. After I stop the first node (11.165.218.219) of the RabbitMQ cluster with the command `rabbitmqctl stop_app`, I find that tasks can no longer be scheduled. I don't know why Airflow cannot use the second RabbitMQ node after the first one is stopped, or whether the `broker_url` config below is enough to give Airflow HA when any RabbitMQ node fails.

RabbitMQ version: 3.13.2

`broker_url` in airflow.cfg:

```
broker_url = amqp://test_user:[email protected]:5672/test_vhost;amqp://test_user:[email protected]:5672/test_vhost;amqp://test_user:[email protected]:5672/test_vhost
```

The queues created by Airflow don't seem to work as expected:

The Airflow DAG instances fail:

Thank you!

GitHub link: https://github.com/apache/airflow/discussions/35649#discussioncomment-11300005
