dipayan80 opened a new issue #9656: URL: https://github.com/apache/airflow/issues/9656
**Apache Airflow version**: 1.10.10, also tested on 1.10.7

**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):

**Environment**: Amazon Linux EC2; webserver/scheduler/worker on separate nodes; Redis queue

- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-release): Amazon Linux
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:

**What happened**:

The webserver times out when trying to run a task from the UI with all dependencies ignored. The Airflow logs show that it did try to add the task to the queue:

```
Jul 4 20:02:35 airflow: [2020-07-04 20:02:35,306] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'my_dag', 'my_task_to_run', '2020-07-02T05:30:00+00:00', '-A', '-i', '--force', '--local', '--pool', u'default_pool', '-sd', 'DAGS_FOLDER/<path to dag>']
```

The UI remains stuck on that screen and eventually times out, throwing a 504 in the browser. Nothing else shows up in the logs besides the message above. We also noticed that one gunicorn webserver worker gets replaced afterwards, presumably the one that was stuck until the timeout.

**What you expected to happen**:

The UI should not time out and should successfully add the task to the queue.

**How to reproduce it**:

This happens every time we try to run a task with its dependencies ignored.
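As a quick sanity check while reproducing, it may help to confirm from the webserver host that the Redis broker is reachable at the TCP level, since the webserver enqueues the task synchronously. This is a hypothetical stdlib-only sketch, not part of Airflow; the host and port are placeholders for whatever your `broker_url` points at:

```python
# Hypothetical diagnostic (not Airflow code): check whether the webserver host
# can open a TCP connection to the Celery broker within a short timeout.
import socket


def broker_reachable(host, port, timeout_s=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout_s."""
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures alike.
        return False


if __name__ == "__main__":
    # "my-redis-host" is a placeholder for the host in your broker_url.
    print(broker_reachable("my-redis-host", 6379))
```

If this returns `False` from the webserver node but `True` from the workers, a firewall or security-group rule between the webserver and the broker would explain the symptom.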
**Anything else we need to know**:

**What do you think went wrong?**:

Running the webserver in debug mode with the `-d` option produces the following traceback:

```
Jul 3 14:59:34 airflow: Traceback (most recent call last):
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 120, in send_task_to_executor
Jul 3 14:59:34 airflow:     result = task.apply_async(args=[command], queue=queue)
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/celery/app/task.py", line 568, in apply_async
Jul 3 14:59:34 airflow:     **options
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/celery/app/base.py", line 780, in send_task
Jul 3 14:59:34 airflow:     amqp.send_task_message(P, name, message, **options)
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/celery/app/amqp.py", line 559, in send_task_message
Jul 3 14:59:34 airflow:     **properties
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/messaging.py", line 181, in publish
Jul 3 14:59:34 airflow:     exchange_name, declare,
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/connection.py", line 533, in _ensured
Jul 3 14:59:34 airflow:     return fun(*args, **kwargs)
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/messaging.py", line 187, in _publish
Jul 3 14:59:34 airflow:     channel = self.channel
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/messaging.py", line 209, in _get_channel
Jul 3 14:59:34 airflow:     channel = self._channel = channel()
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/utils/functional.py", line 45, in __call__
Jul 3 14:59:34 airflow:     value = self.__value__ = self.__contract__()
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/messaging.py", line 224, in <lambda>
Jul 3 14:59:34 airflow:     channel = ChannelPromise(lambda: connection.default_channel)
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/connection.py", line 892, in default_channel
Jul 3 14:59:34 airflow:     self._ensure_connection(**conn_opts)
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/connection.py", line 445, in _ensure_connection
Jul 3 14:59:34 airflow:     callback, timeout=timeout
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/kombu/utils/functional.py", line 358, in retry_over_time
Jul 3 14:59:34 airflow:     sleep(1.0)
Jul 3 14:59:34 airflow:   File "/var/lib/airflow/venv/lib/python2.7/site-packages/airflow/utils/timeout.py", line 43, in handle_timeout
Jul 3 14:59:34 airflow:     raise AirflowTaskTimeout(self.error_message)
Jul 3 14:59:34 airflow: AirflowTaskTimeout: Timeout, PID: 5667
Jul 3 14:59:34 airflow: [2020-07-03 14:59:34,278] {celery_executor.py:226} ERROR - Error sending Celery task:Timeout, PID: 5667
```

We briefly tested a publicly available Airflow docker image and could not reproduce the error there. The only major difference between a vanilla Airflow docker setup and ours is probably the Redis queue?
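The failure mode in the traceback above can be reproduced in miniature: kombu's `retry_over_time` sleeps between broker reconnect attempts, and Airflow's SIGALRM-based timeout fires during that sleep. The sketch below is a hypothetical stand-in, not the actual Airflow or kombu code; the class and function names only mimic `airflow/utils/timeout.py` and the retry loop:

```python
# Hypothetical sketch (not Airflow's actual code): a SIGALRM-based timeout
# context manager, modelled on airflow/utils/timeout.py, interrupts a retry
# loop that stands in for kombu's retry_over_time when the broker is down.
import signal
import time


class AirflowTaskTimeout(Exception):
    """Stand-in for airflow.exceptions.AirflowTaskTimeout."""


class timeout:
    """Minimal SIGALRM context manager, mimicking airflow.utils.timeout."""

    def __init__(self, seconds, error_message="Timeout"):
        self.seconds = seconds
        self.error_message = error_message

    def handle_timeout(self, signum, frame):
        # This mirrors the `raise AirflowTaskTimeout(...)` frame in the traceback.
        raise AirflowTaskTimeout(self.error_message)

    def __enter__(self):
        signal.signal(signal.SIGALRM, self.handle_timeout)
        signal.alarm(self.seconds)
        return self

    def __exit__(self, *exc):
        signal.alarm(0)


def connect_to_unreachable_broker():
    """Stand-in for kombu's retry_over_time: retry forever, sleeping between tries."""
    while True:
        time.sleep(1.0)  # the traceback shows the alarm firing inside this sleep


if __name__ == "__main__":
    try:
        with timeout(2, error_message="Timeout, PID: 5667"):
            connect_to_unreachable_broker()
    except AirflowTaskTimeout as exc:
        # prints: Error sending Celery task: Timeout, PID: 5667
        print("Error sending Celery task:", exc)
```

If this is what is happening, the webserver's send succeeds only once the broker is reachable; raising the timeout would not help, since the reconnect loop would simply sleep longer before the same exception.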
---

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: [email protected]
