[
https://issues.apache.org/jira/browse/AIRFLOW-6250?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16998699#comment-16998699
]
ASF subversion and git services commented on AIRFLOW-6250:
----------------------------------------------------------
Commit a3fe79774e409ae17cf70c5e6e550504730d3138 in airflow's branch
refs/heads/v1-10-test from yuqian90
[ https://gitbox.apache.org/repos/asf?p=airflow.git;h=a3fe79774 ]
[AIRFLOW-6250] Ensure on_failure_callback always has a populated context (#6812)
(cherry-picked from 1006740aa92d584cfb0317c922184ef758bf108a)
> on_failure_callback does not know the task_id when handle_failure() is called
> without passing context
> -----------------------------------------------------------------------------------------------------
>
> Key: AIRFLOW-6250
> URL: https://issues.apache.org/jira/browse/AIRFLOW-6250
> Project: Apache Airflow
> Issue Type: Improvement
> Components: scheduler
> Affects Versions: 1.10.6
> Reporter: Qian Yu
> Assignee: Qian Yu
> Priority: Major
> Fix For: 1.10.7
>
>
> The following code in scheduler_job.py can be hit, e.g. when
> {{send_task_to_executor()}} in celery_executor.py is too slow and times out
> after 2 seconds. This call to {{handle_failure()}} does not pass a
> {{context}} object, so the {{on_failure_callback}} and {{on_retry_callback}}
> of the task have no idea which task failed.
>
> This can be fixed by making {{handle_failure()}} derive a reasonable default
> value for the {{context}} argument.
>
> {code:python}
> simple_dag = simple_dag_bag.get_dag(dag_id)
> dagbag = models.DagBag(simple_dag.full_filepath)
> dag = dagbag.get_dag(dag_id)
> ti.task = dag.get_task(task_id)
> ti.handle_failure(msg)
> {code}
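> The proposed fix can be sketched as follows. This is a simplified,
> self-contained illustration, not the actual Airflow implementation: the
> {{TaskInstance}} class and {{get_template_context()}} helper below are
> reduced stand-ins showing how a default context could be derived when the
> caller omits one.
> {code:python}
class TaskInstance:
    def __init__(self, task_id, dag_id, on_failure_callback=None):
        self.task_id = task_id
        self.dag_id = dag_id
        self.on_failure_callback = on_failure_callback

    def get_template_context(self):
        # Build a minimal default context so callbacks can still
        # identify the failed task.
        return {"task_id": self.task_id, "dag_id": self.dag_id}

    def handle_failure(self, error, context=None):
        # Derive a reasonable default when no context was passed,
        # instead of invoking callbacks with context=None.
        if context is None:
            context = self.get_template_context()
        context["exception"] = error
        if self.on_failure_callback is not None:
            self.on_failure_callback(context)


# Callers such as the scheduler's executor-timeout path, which do not
# pass a context, now still give the callback the task identity:
seen = {}
ti = TaskInstance("my_task", "my_dag", on_failure_callback=seen.update)
ti.handle_failure("timed out")
> {code}
> After {{handle_failure("timed out")}} runs, {{seen}} contains
> {{task_id}}, {{dag_id}}, and the exception, even though no context was
> supplied by the caller.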
--
This message was sent by Atlassian Jira
(v8.3.4#803005)