ephraimbuddy commented on code in PR #35857:
URL: https://github.com/apache/airflow/pull/35857#discussion_r1405348580
##########
airflow/jobs/scheduler_job_runner.py:
##########
@@ -1577,15 +1577,18 @@ def _fail_tasks_stuck_in_queued(self, session: Session = NEW_SESSION) -> None:
             )
         ).all()
         try:
-            tis_for_warning_message = self.job.executor.cleanup_stuck_queued_tasks(tis=tasks_stuck_in_queued)
-            if tis_for_warning_message:
-                task_instance_str = "\n\t".join(tis_for_warning_message)
-                self.log.warning(
-                    "Marked the following %s task instances stuck in queued as failed. "
-                    "If the task instance has available retries, it will be retried.\n\t%s",
-                    len(tasks_stuck_in_queued),
-                    task_instance_str,
-                )
+            cleaned_up_task_instances = self.job.executor.cleanup_stuck_queued_tasks(
+                tis=tasks_stuck_in_queued
+            )
+            cleaned_up_task_instances = set(cleaned_up_task_instances)
+            for ti in tasks_stuck_in_queued:
+                if repr(ti) in cleaned_up_task_instances:
+                    self._task_context_logger.error(
+                        "Marking task instance %s stuck in queued as failed. "
+                        "If the task instance has available retries, it will be retried.",
+                        ti,
+                        ti=ti,
Review Comment:
Looks good, but we would lose this information if the user disables the
feature. I think both the scheduler log and the task log should have the
information for the time being.
Another thing that worries me is the performance implication, but in theory
I don't think there would be many tasks stuck in the queued state.
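The suggestion of keeping the message in both the scheduler log and the task log could be sketched roughly as follows. This is only a minimal illustration of the logging flow: the membership check via `repr(ti)` mirrors the diff above, while the stub task-context logger and the free-standing function are stand-ins for the real Airflow objects, not the actual scheduler code:

```python
import logging

class _StubTaskContextLogger:
    """Stand-in for Airflow's task-context logger: records messages that
    would be routed to the task instance's own log."""

    def __init__(self):
        self.messages = []

    def error(self, msg, *args, ti=None):
        self.messages.append((msg % args, ti))


def fail_tasks_stuck_in_queued(tasks_stuck_in_queued, cleaned_up_task_instances,
                               scheduler_log, task_context_logger):
    """Log each stuck-in-queued failure to BOTH the scheduler log and the
    task's log, so the information survives if task-context logging is
    disabled (the concern raised in the review comment)."""
    cleaned_up = set(cleaned_up_task_instances)
    for ti in tasks_stuck_in_queued:
        if repr(ti) in cleaned_up:
            # Scheduler-side record: always available in the scheduler log.
            scheduler_log.warning(
                "Marking task instance %s stuck in queued as failed. "
                "If the task instance has available retries, it will be retried.",
                ti,
            )
            # Task-side record: surfaces in the task's own log when enabled.
            task_context_logger.error(
                "Marking task instance %s stuck in queued as failed. "
                "If the task instance has available retries, it will be retried.",
                ti,
                ti=ti,
            )


# Usage: only "ti_1" was cleaned up by the executor, so only it is logged.
log = logging.getLogger("scheduler")
tcl = _StubTaskContextLogger()
fail_tasks_stuck_in_queued(["ti_1", "ti_2"], [repr("ti_1")], log, tcl)
```

On the performance worry: the loop is linear in the number of stuck task instances, and the `set` conversion keeps the membership test O(1), so the cost should stay negligible as long as few tasks are stuck in queued at once.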
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]