ginwakeup commented on issue #31972:
URL: https://github.com/apache/airflow/issues/31972#issuecomment-1595366827
@jens-scheffler-bosch forgive me, those aren't the scheduler logs; they're the triggerer's.
The scheduler errors like this:
```
[2023-06-16T22:01:08.483+0000] {settings.py:407} DEBUG - Disposing DB connection pool (PID 6602)
Process DagFileProcessor910-Process:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/dag_processing/processor.py", line 174, in _run_file_processor
    _handle_dag_file_processing()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/dag_processing/processor.py", line 158, in _handle_dag_file_processing
    callback_requests=callback_requests,
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 75, in wrapper
    return func(*args, session=session, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/dag_processing/processor.py", line 768, in process_file
    dagbag.sync_to_db(processor_subdir=self._dag_directory, session=session)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 72, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 645, in sync_to_db
    for attempt in run_with_db_retries(logger=self.log):
  File "/home/airflow/.local/lib/python3.7/site-packages/tenacity/__init__.py", line 384, in __iter__
    do = self.iter(retry_state=retry_state)
  File "/home/airflow/.local/lib/python3.7/site-packages/tenacity/__init__.py", line 351, in iter
    return fut.result()
  File "/usr/local/lib/python3.7/concurrent/futures/_base.py", line 428, in result
    return self.__get_result()
  File "/usr/local/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result
    raise self._exception
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 660, in sync_to_db
    self.dags.values(), processor_subdir=processor_subdir, session=session
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 72, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/dag.py", line 2687, in bulk_write_to_db
    orm_dags: list[DagModel] = with_row_locks(query, of=DagModel, session=session).all()
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 2772, in all
    return self._iter().all()
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 2918, in _iter
    execution_options={"_sa_orm_load_options": self.load_options},
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1713, in execute
    conn = self._connection_for_bind(bind)
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1553, in _connection_for_bind
    engine, execution_options
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 721, in _connection_for_bind
    self._assert_active()
  File "/home/airflow/.local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 608, in _assert_active
    code="7s2a",
sqlalchemy.exc.PendingRollbackError: This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "serialized_dag_pkey"
DETAIL: Key (dag_id)=(***) already exists.
```
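For anyone skimming the traceback: the root cause is the `psycopg2.errors.UniqueViolation` at the bottom, i.e. two writers raced to insert the same `dag_id` into `serialized_dag`, and the `PendingRollbackError` is just SQLAlchemy refusing to reuse the session after that failed flush until `Session.rollback()` is called. A minimal sketch of that duplicate-key race, using stdlib `sqlite3` as a stand-in for Postgres (this is illustrative only, not Airflow code; the table name mirrors the constraint in the log):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# dag_id is the primary key, like the "serialized_dag_pkey" constraint above
conn.execute("CREATE TABLE serialized_dag (dag_id TEXT PRIMARY KEY, data TEXT)")

# first writer (e.g. one DagFileProcessor) inserts the row and wins
conn.execute("INSERT INTO serialized_dag VALUES ('my_dag', '{}')")

try:
    # second writer races on the same key and hits the unique constraint,
    # analogous to psycopg2's UniqueViolation in the scheduler log
    conn.execute("INSERT INTO serialized_dag VALUES ('my_dag', '{}')")
except sqlite3.IntegrityError as exc:
    print("duplicate key:", exc)  # UNIQUE constraint failed: serialized_dag.dag_id
```

In Postgres the second insert raises `UniqueViolation` mid-flush, which is why the retry loop (`run_with_db_retries`) then re-raises `PendingRollbackError`: the session is left in a must-rollback state rather than being rolled back before the retry.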