Anvarshahith commented on issue #13668:
URL: https://github.com/apache/airflow/issues/13668#issuecomment-1480908959

   Hi Team, I am also facing a deadlock error (`_mysql_exceptions.OperationalError (1213, 'Deadlock found when trying to get lock; try restarting transaction')`) on my Airflow scheduler.
   
   Airflow version: 1.10.10
   
   logs 
   --------------
   [2023-03-22 03:33:25,419] {taskinstance.py:1145} ERROR - (_mysql_exceptions.OperationalError) (1213, 'Deadlock found when trying to get lock; try restarting transaction')
   [SQL: UPDATE task_instance SET state=%s, queued_dttm=%s WHERE task_instance.task_id = %s AND task_instance.dag_id = %s AND task_instance.execution_date = %s]
   [parameters: ('queued', datetime.datetime(2023, 3, 22, 3, 33, 17, 506127), 'spark_job_etl_gb-monthly-app-consumption', 'etl_pipeline.cluster_A', datetime.datetime(2023, 3, 21, 1, 0))]
   (Background on this error at: http://sqlalche.me/e/e3q8)
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1248, in _execute_context
       cursor, statement, parameters, context
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 588, in do_execute
       cursor.execute(statement, parameters)
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 255, in execute
       self.errorhandler(self, exc, value)
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
       raise errorvalue
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 252, in execute
       res = self._query(query)
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 378, in _query
       db.query(q)
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 280, in query
       _mysql.connection.query(self, query)
   _mysql_exceptions.OperationalError: (1213, 'Deadlock found when trying to get lock; try restarting transaction')
   
   The above exception was the direct cause of the following exception:
   
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 983, in _run_raw_task
       result = task_copy.execute(context=context)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/operators/subdag_operator.py", line 102, in execute
       executor=self.executor)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/dag.py", line 1415, in run
       job.run()
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/base_job.py", line 221, in run
       self._execute()
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/backfill_job.py", line 788, in _execute
       session=session)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/backfill_job.py", line 718, in _execute_for_run_dates
       session=session)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/backfill_job.py", line 587, in _process_backfill_task_instances
       _per_task_process(task, key, ti)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/jobs/backfill_job.py", line 509, in _per_task_process
       session.commit()
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1036, in commit
       self.transaction.commit()
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 503, in commit
       self._prepare_impl()
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 482, in _prepare_impl
       self.session.flush()
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2496, in flush
       self._flush(objects)
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2637, in _flush
       transaction.rollback(_capture_exception=True)
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 69, in __exit__
       exc_value, with_traceback=exc_tb,
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 178, in raise_
       raise exception
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2597, in _flush
       flush_context.execute()
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
       rec.execute(self)
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 589, in execute
       uow,
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 236, in save_obj
       update,
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 995, in _emit_update_statements
       statement, multiparams
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 984, in execute
       return meth(self, multiparams, params)
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 293, in _execute_on_connection
       return connection._execute_clauseelement(self, multiparams, params)
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1103, in _execute_clauseelement
       distilled_params,
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1288, in _execute_context
       e, statement, parameters, cursor, context
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1482, in _handle_dbapi_exception
       sqlalchemy_exception, with_traceback=exc_info[2], from_=e
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 178, in raise_
       raise exception
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1248, in _execute_context
       cursor, statement, parameters, context
     File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 588, in do_execute
       cursor.execute(statement, parameters)
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 255, in execute
       self.errorhandler(self, exc, value)
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
       raise errorvalue
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 252, in execute
       res = self._query(query)
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 378, in _query
       db.query(q)
     File "/home/airflow/.local/lib/python3.6/site-packages/MySQLdb/connections.py", line 280, in query
       _mysql.connection.query(self, query)
   sqlalchemy.exc.OperationalError: (_mysql_exceptions.OperationalError) (1213, 'Deadlock found when trying to get lock; try restarting transaction')
   [SQL: UPDATE task_instance SET state=%s, queued_dttm=%s WHERE task_instance.task_id = %s AND task_instance.dag_id = %s AND task_instance.execution_date = %s]
   [parameters: ('queued', datetime.datetime(2023, 3, 22, 3, 33, 17, 506127), 'spark_job_etl_gb-monthly-app-consumption', 'etl_pipeline.cluster_A', datetime.datetime(2023, 3, 21, 1, 0))]
   (Background on this error at: http://sqlalche.me/e/e3q8)
   [2023-03-22 03:33:25,425] {taskinstance.py:1202} INFO - Marking task as FAILED.dag_id=etl_pipeline, task_id=cluster_A, execution_date=20230321T010000, start_date=20230322T010043, end_date=20230322T033325
   [2023-03-22 03:33:27,896] {logging_mixin.py:112} INFO - [2023-03-22 03:33:27,896] {local_task_job.py:103} INFO - Task exited with return code 1
   ------
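
   For what it's worth, MySQL error 1213 is the server telling the client the transaction was rolled back and can simply be retried ("try restarting transaction"), so deadlocks like this are usually transient. Below is a minimal, hypothetical sketch of the usual workaround, a retry wrapper with backoff around the commit. The names (`DeadlockError`, `run_with_retry`, `flaky_commit`) are illustrative only and are not Airflow or SQLAlchemy APIs; in real code you would catch `sqlalchemy.exc.OperationalError` and check for error code 1213.

   ```python
   import time

   class DeadlockError(Exception):
       """Stand-in for MySQL error 1213 ('Deadlock found when trying to get
       lock; try restarting transaction'); illustrative only."""

   def run_with_retry(work, max_retries=3, sleep=time.sleep):
       """Call `work()`; if it raises a deadlock error, retry with a simple
       linear backoff, re-raising after `max_retries` attempts."""
       for attempt in range(1, max_retries + 1):
           try:
               return work()
           except DeadlockError:
               if attempt == max_retries:
                   raise
               sleep(0.1 * attempt)  # back off a little longer each time

   # Usage: a "transaction" that deadlocks twice, then succeeds on attempt 3.
   calls = {"n": 0}
   def flaky_commit():
       calls["n"] += 1
       if calls["n"] < 3:
           raise DeadlockError("1213: Deadlock found when trying to get lock")
       return "committed"

   print(run_with_retry(flaky_commit, sleep=lambda s: None))  # → committed
   ```

   This is only a sketch of the retry pattern, not a fix for the underlying contention; reducing scheduler/backfill parallelism or upgrading to a newer Airflow (1.10.10 is long EOL) is the more durable answer.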

