Dundo777 opened a new issue #13529:
URL: https://github.com/apache/airflow/issues/13529
**Apache Airflow version**: 2.0.0
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): CentOS 7
- **Kernel** (e.g. `uname -a`): x86_64
- **Install tools**: yum, pip
- **Others**: MariaDB 10.2, Python 3.6.8, RabbitMQ 3.3.5
**What happened**:
After upgrading to Airflow 2.0.0, the scheduler ends up in a deadlock when
running a DAG.
When I trigger a DAG (or just run a task), the scheduler adds the task to the
queue and the worker picks it up and executes it successfully, but the
scheduler eventually ends up in a deadlock and marks the run as failed.
There are no deadlocks in the database and everything there appears to be
working fine.
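An illustrative way to verify the "no deadlocks in the database" part on the MariaDB side (only a sketch, not necessarily the exact check that was run; the connection string is a placeholder and SQLAlchemy with PyMySQL is assumed to be available):

```python
# Illustrative sketch: list open or lock-waiting InnoDB transactions on the
# Airflow metadata database. An empty result set means nothing is currently
# blocked. The connection string below is a placeholder; adjust it to match
# your own sql_alchemy_conn.
from sqlalchemy import create_engine, text

engine = create_engine("mysql+pymysql://airflow:password@localhost/airflow")

with engine.connect() as conn:
    rows = conn.execute(text(
        "SELECT trx_id, trx_state, trx_started, trx_query "
        "FROM information_schema.INNODB_TRX"
    )).fetchall()
    for row in rows:
        print(row)
```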
<details>
<summary>
Scheduler logs
</summary>
Jan 07 07:43:10 host bash[9738]: [2021-01-07 07:43:10,882]
{scheduler_job.py:938} INFO - 1 tasks up for execution:
Jan 07 07:43:10 host bash[9738]: <TaskInstance: dbimport.start 2021-01-07
06:43:10.874823+00:00 [scheduled]>
Jan 07 07:43:10 host bash[9738]: [2021-01-07 07:43:10,885]
{scheduler_job.py:972} INFO - Figuring out tasks to run in Pool(name=dbimport)
with 24 open slots and 1 task instances ready to be queued
Jan 07 07:43:10 host bash[9738]: [2021-01-07 07:43:10,885]
{scheduler_job.py:999} INFO - DAG dbimport has 0/4 running and queued tasks
Jan 07 07:43:10 host bash[9738]: [2021-01-07 07:43:10,885]
{scheduler_job.py:1060} INFO - Setting the following tasks to queued state:
Jan 07 07:43:10 host bash[9738]: <TaskInstance: dbimport.start 2021-01-07
06:43:10.874823+00:00 [scheduled]>
Jan 07 07:43:10 host bash[9738]: [2021-01-07 07:43:10,888]
{scheduler_job.py:1102} INFO - Sending TaskInstanceKey(dag_id='dbimport',
task_id='start', execution_date=datetime.datetime(2021, 1, 7, 6, 43, 10,
874823, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 36 and
queue default
Jan 07 07:43:10 host bash[9738]: [2021-01-07 07:43:10,888]
{base_executor.py:79} INFO - Adding to queue: ['airflow', 'tasks', 'run',
'dbimport', 'start', '2021-01-07T06:43:10.874823+00:00', '--local', '--pool',
'dbimport', '--subdir',
'/usr/local/airflow/dags/bigdata-internal-poc/dbimport.py']
Jan 07 07:43:10 host bash[9738]: [2021-01-07 07:43:10,922]
{scheduler_job.py:1200} INFO - Executor reports execution of dbimport.start
execution_date=2021-01-07 06:43:10.874823+00:00 exited with status queued for
try_number 1
Jan 07 07:43:10 host bash[9738]: [2021-01-07 07:43:10,955]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:11 host bash[9738]: [2021-01-07 07:43:11,994]
{scheduler_job.py:1671} INFO - DAG dbimport already has 2 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:13 host bash[9738]: [2021-01-07 07:43:13,041]
{scheduler_job.py:1671} INFO - DAG dbimport already has 2 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:14 host bash[9738]: [2021-01-07 07:43:14,087]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:14 host bash[9738]: [2021-01-07 07:43:14,102]
{scheduler_job.py:1200} INFO - Executor reports execution of dbimport.start
execution_date=2021-01-07 06:43:10.874823+00:00 exited with status success for
try_number 1
Jan 07 07:43:14 host bash[9738]: [2021-01-07 07:43:14,132]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:15 host bash[9738]: [2021-01-07 07:43:15,169]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:15 host bash[9738]: [2021-01-07 07:43:15,956]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:17 host bash[9738]: [2021-01-07 07:43:17,001]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:18 host bash[9738]: [2021-01-07 07:43:18,032]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:19 host bash[9738]: [2021-01-07 07:43:19,062]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:19 host bash[9738]: [2021-01-07 07:43:19,854]
{scheduler_job.py:1671} INFO - DAG dbimport already has 1 active runs, not
queuing any tasks for run 2021-01-07 06:43:09.837529+00:00
Jan 07 07:43:19 host bash[9738]: [2021-01-07 07:43:19,878]
{scheduler_job.py:853} WARNING - Set 1 task instances to state=None as their
associated DagRun was not in RUNNING state
Jan 07 07:43:20 host bash[9738]: [2021-01-07 07:43:20,920] {dagrun.py:459}
ERROR - Deadlock; marking run <DagRun dbimport @ 2021-01-07
06:43:09.837529+00:00: manual__2021-01-07T06:43:09.837529+00:00, externally
triggered: True> failed
</details>
<details>
<summary>
Worker logs
</summary>
Jan 07 07:43:10 host bash[8564]: [2021-01-07 07:43:10,935:
INFO/MainProcess] Received task:
airflow.executors.celery_executor.execute_command[6bc6dc4e-662e-4e4c-89af-50f070e21d9c]
Jan 07 07:43:11 host bash[8564]: [2021-01-07 07:43:11,077:
INFO/ForkPoolWorker-8] Executing command in Celery: ['airflow', 'tasks', 'run',
'dbimport', 'start', '2021-01-07T06:43:10.874823+00:00', '--local', '--pool',
'dbimport', '--subdir',
'/usr/local/airflow/dags/bigdata-internal-poc/dbimport.py']
Jan 07 07:43:11 host bash[8564]: [2021-01-07 07:43:11,198:
INFO/ForkPoolWorker-8] Filling up the DagBag from
/usr/local/airflow/dags/bigdata-internal-poc/dbimport.py
Jan 07 07:43:11 host bash[8564]: [2021-01-07 07:43:11,543:
WARNING/ForkPoolWorker-8] Running <TaskInstance: dbimport.start
2021-01-07T06:43:10.874823+00:00 [None]> on host analytics-dn3.lan.croz.net
Jan 07 07:43:11 host sudo[12294]: airflow : TTY=unknown ;
PWD=/tmp/airflowtmp258n1mmv ; USER=dbimport ; COMMAND=/bin/bash -c
/usr/local/dbimport/bin/manage --checkAirflowExecution
Jan 07 07:43:13 host bash[8564]: [2021-01-07 07:43:13,431:
INFO/ForkPoolWorker-8] Task
airflow.executors.celery_executor.execute_command[6bc6dc4e-662e-4e4c-89af-50f070e21d9c]
succeeded in 2.466554159997031s: None
</details>
**What you expected to happen**:
The scheduler should mark the task as succeeded and should not end up in a
deadlock.
**How to reproduce it**:
Upgrade from Airflow 1.10.14 to 2.0.0 with MariaDB 10.2 as the metadata
database, then trigger a DAG or run a task (a minimal illustrative DAG is
sketched below).
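The real dbimport DAG is not attached to this issue; the sketch below is only illustrative. The dag_id, task_id, and pool name match the scheduler log above, while the schedule, start date, and bash command are placeholders:

```python
# Illustrative sketch only (assumed shape of the DAG; not the real dbimport code).
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id="dbimport",
    schedule_interval=None,  # runs are triggered manually
    start_date=days_ago(1),
    catchup=False,
) as dag:
    start = BashOperator(
        task_id="start",
        pool="dbimport",  # the pool must already exist (Admin -> Pools)
        bash_command="echo placeholder",  # stand-in for the real import command
    )
```

Triggering it, e.g. with `airflow dags trigger dbimport`, or running the task directly with `airflow tasks run` as shown in the scheduler log, is enough to hit the behaviour described above.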
**Anything else we need to know**:
The problem occurs every time a DAG or task is triggered. This is critical
for us because we currently cannot run anything.