dobixu opened a new issue, #35619:
URL: https://github.com/apache/airflow/issues/35619

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   ```
   [2023-11-14T11:31:46.663+0800] {base_events.py:1753} ERROR - Future exception was never retrieved
   future: <Future finished exception=ValueError("'master_name' transport option must be specified.")>
   Traceback (most recent call last):
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/virtual/base.py", line 951, in create_channel
       return self._avail_channels.pop()
   IndexError: pop from empty list

   During handling of the above exception, another exception occurred:

   Traceback (most recent call last):
     File "/usr/lib64/python3.9/concurrent/futures/thread.py", line 58, in run
       result = self.fn(*self.args, **self.kwargs)
     File "/usr/local/lib/python3.9/site-packages/celery/app/control.py", line 662, in enable_events
       return self.broadcast(
     File "/usr/local/lib/python3.9/site-packages/celery/app/control.py", line 776, in broadcast
       return self.mailbox(conn)._broadcast(
     File "/usr/local/lib/python3.9/site-packages/kombu/pidbox.py", line 330, in _broadcast
       chan = channel or self.connection.default_channel
     File "/usr/local/lib/python3.9/site-packages/kombu/connection.py", line 953, in default_channel
       self._ensure_connection(**conn_opts)
     File "/usr/local/lib/python3.9/site-packages/kombu/connection.py", line 459, in _ensure_connection
       return retry_over_time(
     File "/usr/local/lib/python3.9/site-packages/kombu/utils/functional.py", line 318, in retry_over_time
       return fun(*args, **kwargs)
     File "/usr/local/lib/python3.9/site-packages/kombu/connection.py", line 934, in _connection_factory
       self._connection = self._establish_connection()
     File "/usr/local/lib/python3.9/site-packages/kombu/connection.py", line 860, in _establish_connection
       conn = self.transport.establish_connection()
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/virtual/base.py", line 975, in establish_connection
       self._avail_channels.append(self.create_channel(self))
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/virtual/base.py", line 953, in create_channel
       channel = self.Channel(connection)
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/redis.py", line 741, in __init__
       self.client.ping()
     File "/usr/local/lib/python3.9/site-packages/kombu/utils/objects.py", line 31, in __get__
       return super().__get__(instance, owner)
     File "/usr/lib64/python3.9/functools.py", line 993, in __get__
       val = self.func(instance)
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/redis.py", line 1254, in client
       return self._create_client(asynchronous=True)
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/redis.py", line 1210, in _create_client
       return self.Client(connection_pool=self.async_pool)
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/redis.py", line 1248, in async_pool
       self._async_pool = self._get_pool(asynchronous=True)
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/redis.py", line 1433, in _get_pool
       return self._sentinel_managed_pool(asynchronous)
     File "/usr/local/lib/python3.9/site-packages/kombu/transport/redis.py", line 1423, in _sentinel_managed_pool
       raise ValueError(
   ValueError: 'master_name' transport option must be specified.
   ```
   
   
   ### What you think should happen instead
   
   Airflow should support connecting to a Redis Sentinel cluster once configured correctly. Please provide a default airflow.cfg configuration for a Redis Sentinel cluster. Thank you very much.
   
   ### How to reproduce
   
   ## File where the error is raised: [kombu/transport/redis.py](https://github.com/celery/kombu/blob/main/kombu/transport/redis.py)
   
   ## It works as expected with a single-Redis configuration:
   ```
   [celery]
   broker_url = redis://:[email protected]:16379/0
   ```
   
   
   ## The Redis Sentinel cluster itself runs stably, but I have tried multiple configuration approaches and none of them resolved the issue. The configurations attempted are as follows:
   1. Example
   ```
   [celery]
   broker_url = redis+sentinel://:[email protected]:26379,172.16.0.204:26379,172.16.0.206:26379/master
   ```
   2. Example
   ```
   [celery]
   broker_url = sentinel://[email protected]:26379/0;sentinel://[email protected]:26379/0;sentinel://[email protected]:26379/master
   BROKER_TRANSPORT_OPTIONS = {"master_name": "master"}
   ```
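   For reference, here is a sketch of what I would expect a working configuration to look like. This is an assumption, not a verified setup: it relies on Airflow forwarding the `[celery_broker_transport_options]` section of airflow.cfg into Celery's `broker_transport_options`, and the hosts, password, and `master` name are placeholders taken from the attempts above.

   ```ini
   ; Sketch only -- assumes [celery_broker_transport_options] is passed through
   ; to Celery's broker_transport_options; hosts/password/master are placeholders.
   [celery]
   broker_url = sentinel://:[email protected]:26379;sentinel://:[email protected]:26379;sentinel://:[email protected]:26379

   [celery_broker_transport_options]
   master_name = master
   ```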
   
   
   
   
   ### Operating System
   
   AlmaLinux release 9.2 
   
   ### Versions of Apache Airflow Providers
   
   2.3.3/2.7.2
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   ```yaml
   version: '3.8'
   networks:
     airflow:
       ipam:
         config:
           - subnet: 172.178.0.0/24
   x-airflow-common:
     &airflow-common
     image: registry.manyun-inc.com/services/airflow:2.7.2
     #image: registry.manyun-inc.com/services/airflow:2.3.3
     environment:
       &airflow-common-env
       TZ: 'Asia/Shanghai'
       AIRFLOW_HOME: '/data/airflow'
       AIRFLOW_CORE_DEFAULT_TIMEZONE: Asia/Shanghai
       AIRFLOW_WEBSERVER_DEFAULT_TIMEZONE: Asia/Shanghai
     #env_file:
     #  - mysql.env
     volumes:
       #- /etc/localtime:/etc/localtime:ro
       #- /etc/timezone:/etc/timezone:ro
       - /var/run/docker.sock:/var/run/docker.sock
       - ./dags:/data/airflow/dags
       - ./logs:/data/airflow/logs
       #- ./d_task:/data/airflow/d-task
       - ./plugins:/data/airflow/plugins
       - ./airflow.cfg:/data/airflow/airflow.cfg
   services:
     airflow-webserver:
       <<: *airflow-common
       container_name: airflow.webserver
       command: webserver
       ports:
         - 38080:8080
       healthcheck:
         test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
       networks:
         - airflow
     airflow-scheduler1:
       <<: *airflow-common
       container_name: airflow.scheduler_1
       command: scheduler
       healthcheck:
         test: "ps axu | grep -vE 'grep|restart' | grep 'scheduler'"
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
       networks:
         - airflow
     airflow-worker:
       <<: *airflow-common
       container_name: airflow.worker
       command: celery worker
       healthcheck:
         test:
           - "CMD-SHELL"
           - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
       networks:
         - airflow
     airflow-init:
       <<: *airflow-common
       environment:
         <<: *airflow-common-env
       container_name: airflow.init
       command: db migrate
       #command: db init
       networks:
         - airflow
     flower:
       <<: *airflow-common
       container_name: airflow.flower
       command: celery flower
       ports:
         - 5555:5555
       healthcheck:
         test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
         interval: 10s
         timeout: 10s
         retries: 5
       restart: always
       networks:
         - airflow
   ```
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

