stablum edited a comment on issue #19957:
URL: https://github.com/apache/airflow/issues/19957#issuecomment-994343254
Unfortunately, this keeps happening (after a couple of weeks during which it was
running smoothly):
```
[2021-12-15 01:54:30,915] {dagbag.py:500} INFO - Filling up the DagBag from
/root/learning_sets/models/
dag_bag <airflow.models.dagbag.DagBag object at 0x7f56aa88cf70>
Running <TaskInstance: download_and_preprocess_sets.download_1466
manual__2021-12-14T06:28:19.872227+00:00 [queued]> on host AI-Research
Running <TaskInstance: download_and_preprocess_sets.download_952
manual__2021-12-14T06:28:19.872227+00:00 [queued]> on host AI-Research
[2021-12-15 01:54:43,539] {scheduler_job.py:644} ERROR - Exception when
executing SchedulerJob._run_scheduler_loop
Traceback (most recent call last):
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1276, in _execute_context
self.dialect.do_execute(
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/default.py", line
608, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.DeadlockDetected: deadlock detected
DETAIL: Process 1117623 waits for ShareLock on transaction 4526903; blocked
by process 1206850.
Process 1206850 waits for AccessExclusiveLock on tuple (1,17) of relation
19255 of database 19096; blocked by process 1206469.
Process 1206469 waits for ShareLock on transaction 4526895; blocked by
process 1117623.
HINT: See server log for query details.
CONTEXT: while updating tuple (1899,3) in relation "task_instance"
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File
"/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line
628, in _execute
self._run_scheduler_loop()
File
"/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line
709, in _run_scheduler_loop
num_queued_tis = self._do_scheduling(session)
File
"/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line
792, in _do_scheduling
callback_to_run = self._schedule_dag_run(dag_run, session)
File
"/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line
1049, in _schedule_dag_run
dag_run.schedule_tis(schedulable_tis, session)
File "/usr/local/lib/python3.9/dist-packages/airflow/utils/session.py",
line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/airflow/models/dagrun.py",
line 898, in schedule_tis
session.query(TI)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/query.py",
line 4063, in update
update_op.exec_()
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line
1697, in exec_
self._do_exec()
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line
1895, in _do_exec
self._execute_stmt(update_stmt)
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line
1702, in _execute_stmt
self.result = self.query._execute_crud(stmt, self.mapper)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/query.py",
line 3568, in _execute_crud
return conn.execute(stmt, self._params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1011, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/sql/elements.py",
line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1124, in _execute_clauseelement
ret = self._execute_context(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1316, in _execute_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1510, in _handle_dbapi_exception
util.raise_(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/util/compat.py",
line 182, in raise_
raise exception
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1276, in _execute_context
self.dialect.do_execute(
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/default.py", line
608, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (psycopg2.errors.DeadlockDetected) deadlock
detected
DETAIL: Process 1117623 waits for ShareLock on transaction 4526903; blocked
by process 1206850.
Process 1206850 waits for AccessExclusiveLock on tuple (1,17) of relation
19255 of database 19096; blocked by process 1206469.
Process 1206469 waits for ShareLock on transaction 4526895; blocked by
process 1117623.
HINT: See server log for query details.
CONTEXT: while updating tuple (1899,3) in relation "task_instance"
[SQL: UPDATE task_instance SET state=%(state)s WHERE task_instance.dag_id =
%(dag_id_1)s AND task_instance.run_id = %(run_id_1)s AND task_instance.task_id
IN (%(task_id_1)s, %(task_id_2)s, %(task_id_3)s, %(task_id_4)s, %(task_id_5)s,
%(task_id_6)s, %(task_id_7)s)]
[parameters: {'state': <TaskInstanceState.SCHEDULED: 'scheduled'>,
'dag_id_1': 'download_and_preprocess_sets', 'run_id_1':
'manual__2021-12-14T06:28:19.872227+00:00', 'task_id_1': 'download_936',
'task_id_2': 'download_937', 'task_id_3': 'download_938', 'task_id_4':
'download_939', 'task_id_5': 'download_944', 'task_id_6': 'download_946',
'task_id_7': 'download_950'}]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
[2021-12-15 01:54:43,544] {local_executor.py:388} INFO - Shutting down
LocalExecutor; waiting for running tasks to finish. Signal again if you don't
want to wait.
[2021-12-15 01:54:53,885] {process_utils.py:100} INFO - Sending
Signals.SIGTERM to GPID 1117612
[2021-12-15 01:54:54,098] {process_utils.py:66} INFO - Process
psutil.Process(pid=1117612, status='terminated', exitcode=0,
started='01:38:54') (1117612) terminated with exit code 0
[2021-12-15 01:54:54,098] {scheduler_job.py:655} INFO - Exited execute loop
[2021-12-15 01:54:54 +0100] [1117543] [INFO] Handling signal: term
[2021-12-15 01:54:54 +0100] [1117568] [INFO] Worker exiting (pid: 1117568)
[2021-12-15 01:54:54 +0100] [1117549] [INFO] Worker exiting (pid: 1117549)
Traceback (most recent call last):
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1276, in _execute_context
self.dialect.do_execute(
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/default.py", line
608, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.DeadlockDetected: deadlock detected
DETAIL: Process 1117623 waits for ShareLock on transaction 4526903; blocked
by process 1206850.
Process 1206850 waits for AccessExclusiveLock on tuple (1,17) of relation
19255 of database 19096; blocked by process 1206469.
Process 1206469 waits for ShareLock on transaction 4526895; blocked by
process 1117623.
HINT: See server log for query details.
CONTEXT: while updating tuple (1899,3) in relation "task_instance"
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.9/dist-packages/airflow/__main__.py", line
48, in main
args.func(args)
File "/usr/local/lib/python3.9/dist-packages/airflow/cli/cli_parser.py",
line 48, in command
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/airflow/utils/cli.py", line
92, in wrapper
return f(*args, **kwargs)
File
"/usr/local/lib/python3.9/dist-packages/airflow/cli/commands/scheduler_command.py",
line 75, in scheduler
_run_scheduler_job(args=args)
File
"/usr/local/lib/python3.9/dist-packages/airflow/cli/commands/scheduler_command.py",
line 46, in _run_scheduler_job
job.run()
File "/usr/local/lib/python3.9/dist-packages/airflow/jobs/base_job.py",
line 245, in run
self._execute()
File
"/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line
628, in _execute
self._run_scheduler_loop()
File
"/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line
709, in _run_scheduler_loop
num_queued_tis = self._do_scheduling(session)
File
"/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line
792, in _do_scheduling
callback_to_run = self._schedule_dag_run(dag_run, session)
File
"/usr/local/lib/python3.9/dist-packages/airflow/jobs/scheduler_job.py", line
1049, in _schedule_dag_run
dag_run.schedule_tis(schedulable_tis, session)
File "/usr/local/lib/python3.9/dist-packages/airflow/utils/session.py",
line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/dist-packages/airflow/models/dagrun.py",
line 898, in schedule_tis
session.query(TI)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/query.py",
line 4063, in update
update_op.exec_()
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line
1697, in exec_
self._do_exec()
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line
1895, in _do_exec
self._execute_stmt(update_stmt)
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/persistence.py", line
1702, in _execute_stmt
self.result = self.query._execute_crud(stmt, self.mapper)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/orm/query.py",
line 3568, in _execute_crud
return conn.execute(stmt, self._params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1011, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/sql/elements.py",
line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1124, in _execute_clauseelement
ret = self._execute_context(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1316, in _execute_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1510, in _handle_dbapi_exception
util.raise_(
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/util/compat.py",
line 182, in raise_
raise exception
File "/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/base.py",
line 1276, in _execute_context
self.dialect.do_execute(
File
"/usr/local/lib/python3.9/dist-packages/sqlalchemy/engine/default.py", line
608, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (psycopg2.errors.DeadlockDetected) deadlock
detected
DETAIL: Process 1117623 waits for ShareLock on transaction 4526903; blocked
by process 1206850.
Process 1206850 waits for AccessExclusiveLock on tuple (1,17) of relation
19255 of database 19096; blocked by process 1206469.
Process 1206469 waits for ShareLock on transaction 4526895; blocked by
process 1117623.
HINT: See server log for query details.
CONTEXT: while updating tuple (1899,3) in relation "task_instance"
[SQL: UPDATE task_instance SET state=%(state)s WHERE task_instance.dag_id =
%(dag_id_1)s AND task_instance.run_id = %(run_id_1)s AND task_instance.task_id
IN (%(task_id_1)s, %(task_id_2)s, %(task_id_3)s, %(task_id_4)s, %(task_id_5)s,
%(task_id_6)s, %(task_id_7)s)]
[parameters: {'state': <TaskInstanceState.SCHEDULED: 'scheduled'>,
'dag_id_1': 'download_and_preprocess_sets', 'run_id_1':
'manual__2021-12-14T06:28:19.872227+00:00', 'task_id_1': 'download_936',
'task_id_2': 'download_937', 'task_id_3': 'download_938', 'task_id_4':
'download_939', 'task_id_5': 'download_944', 'task_id_6': 'download_946',
'task_id_7': 'download_950'}]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
[2021-12-15 01:54:54 +0100] [1117543] [INFO] Shutting down: Master
```
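Since the `HINT` only points at the server log, here is a rough psycopg2 sketch (independent of Airflow; the DSN is a placeholder for my setup) that I could run the next time the scheduler gets stuck, to see which backends are blocked, which PIDs are blocking them, and what they are executing:
```python
# Rough diagnostic sketch, independent of Airflow: list backends that are
# currently blocked and the PIDs blocking them (pg_blocking_pids, PG >= 9.6).
import psycopg2

QUERY = """
SELECT pid,
       pg_blocking_pids(pid) AS blocked_by,
       state,
       wait_event_type,
       left(query, 200) AS query
FROM pg_stat_activity
WHERE cardinality(pg_blocking_pids(pid)) > 0
"""

conn = psycopg2.connect("dbname=airflow user=postgres")  # placeholder DSN
try:
    with conn.cursor() as cur:
        cur.execute(QUERY)
        for pid, blocked_by, state, wait_event_type, query in cur.fetchall():
            print(f"pid={pid} blocked by {blocked_by} ({state}, {wait_event_type}): {query}")
finally:
    conn.close()
```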
I just restarted it, and these are the scheduler processes now running (I launched
it with `screen`):
```
root@AI-Research:~# ps aux | grep airflow | grep sched | grep -v grep
root 3166838 0.0 0.0 9024 2460 ? Ss 07:17 0:00 SCREEN -L
-Logfile logs/airflow_scheduler_20211215_071716.log -S airflow_scheduler -d -m
airflow scheduler
root 3166845 56.1 1.0 212516 179704 pts/66 Rs+ 07:17 0:16
/usr/bin/python3 /usr/local/bin/airflow scheduler
root 3167171 0.5 0.4 111728 74756 pts/66 S 07:17 0:00 airflow
scheduler -- DagFileProcessorManager
```
And this is the launching script:
```
root@AI-Research:~/learning_sets/airflow# cat launch_airflow.sh
#!/bin/bash
TS=$(date +%Y%m%d_%H%M%S)
screen -X -S airflow_scheduler quit
screen -X -S airflow_webserver quit
sleep 1
ps aux | grep airflow | grep -v launch | awk '{print $2}' | xargs kill
sleep 1
screen -L -Logfile logs/airflow_scheduler_${TS}.log -S airflow_scheduler -d
-m airflow scheduler
screen -L -Logfile logs/airflow_webserver_${TS}.log -S airflow_webserver -d
-m airflow webserver
```
Responding to your questions:
* No other process or script is using the PostgreSQL db at the moment
* The last two queries will be added in my next comment (in the meantime, the sketch below shows how I could get the conflicting statements logged server-side)
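A minimal sketch (assuming a superuser connection; the DSN is again a placeholder) of how I could turn on `log_lock_waits`, so that lock waits longer than `deadlock_timeout` also show up in the PostgreSQL server log alongside the deadlock reports:
```python
# Minimal sketch: enable lock-wait logging so the PostgreSQL server log also
# records long lock waits, not just detected deadlocks. Requires superuser.
import psycopg2

conn = psycopg2.connect("dbname=airflow user=postgres")  # placeholder DSN
conn.autocommit = True  # ALTER SYSTEM must run outside a transaction block
with conn.cursor() as cur:
    cur.execute("ALTER SYSTEM SET log_lock_waits = on")
    cur.execute("SELECT pg_reload_conf()")  # apply the setting without a restart
conn.close()
```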