GitHub user trikker edited a comment on the discussion: Is there a function 
that allows the celery worker provided by airflow to take DAG from the Rabbitmq 
Quorum queue?

@jojaeng2 hi jojaeng2, I am testing the HA of all Airflow components and may 
need your help. After I stop the first node (11.165.218.219) of the RabbitMQ 
cluster with the command `rabbitmqctl stop_app`, I find that tasks can no 
longer be scheduled. I don't understand why Airflow does not fail over to the 
second RabbitMQ node after I stop the first one, or whether the broker_url 
config below is enough to keep RabbitMQ highly available when a node fails.

RabbitMQ version: 3.13.2
broker_url in `airflow.cfg`:
`broker_url = amqp://test_user:[email protected]:5672/test_vhost;amqp://test_user:[email protected]:5672/test_vhost;amqp://test_user:[email protected]:5672/test_vhost`

The queues created by Airflow don't seem to work as expected:

![image](https://github.com/user-attachments/assets/7f9ed280-e5b0-4628-af3d-a9909eb280f3)

Airflow DAG task instances fail:

![image](https://github.com/user-attachments/assets/abb06abe-2300-4c35-be9f-19c790b453dd)


Thank you!

GitHub link: 
https://github.com/apache/airflow/discussions/35649#discussioncomment-11300005
