[ https://issues.apache.org/jira/browse/AIRFLOW-5071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17167821#comment-17167821 ]

Sudip Datta commented on AIRFLOW-5071:
--------------------------------------

[~turbaszek] [~kaxilnaik]

We are also facing a similar situation with some of our DAGs (we have around 
150 DAGs). We were on 1.10.3 and occasionally saw this error; however, while 
trying to upgrade to 1.10.10, the problem has become significantly worse (it 
shows up even with a single DAG containing one BashOperator). We use Celery 
workers and a managed Redis queue.

Weirdly, it seems to be affecting some of our DAGs while others continue to 
run smoothly. The affected DAGs include a simple one that just uses a 
BashOperator to run a shell script once a day; yet another DAG that also only 
has a BashOperator, and runs every hour, is unaffected. Other jobs that use 
the SSHOperator are affected as well.
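
For context, the affected daily DAG is essentially of the following shape. 
This is only a minimal sketch using the standard 1.10.x BashOperator API; the 
dag_id, start_date, schedule, and script path are illustrative placeholders, 
not our actual code:

```python
# Minimal sketch of the daily BashOperator DAG described above (Airflow 1.10.x
# style imports). dag_id, start_date, and the script path are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    "owner": "airflow",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,  # same flag that triggers the failure e-mails
}

with DAG(
    dag_id="daily_shell_script",        # placeholder DAG id
    default_args=default_args,
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",         # the DAG runs once a day
    catchup=False,
) as dag:
    run_script = BashOperator(
        task_id="run_shell_script",
        # Trailing space stops Airflow from treating the ".sh" path as a
        # Jinja template file; the path itself is a placeholder.
        bash_command="/opt/scripts/nightly_job.sh ",
    )
```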

We are using the official Airflow Docker image, 
apache/airflow:1.10.10-python3.6.

We are happy to provide any other details that might help, either here or on 
Slack, as this is a major blocker for our upgrade.

> Thousands of Executor reports task instance X finished (success) although the 
> task says its queued. Was the task killed externally?
> ----------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-5071
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5071
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DAG, scheduler
>    Affects Versions: 1.10.3
>            Reporter: msempere
>            Priority: Critical
>             Fix For: 1.10.12
>
>         Attachments: image-2020-01-27-18-10-29-124.png, 
> image-2020-07-08-07-58-42-972.png
>
>
> I'm opening this issue because since I updated to 1.10.3 I'm seeing thousands 
> of daily messages like the following in the logs:
>  
> ```
>  {{__init__.py:1580}} ERROR - Executor reports task instance <TaskInstance: X 
> 2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> {{jobs.py:1484}} ERROR - Executor reports task instance <TaskInstance: X 
> 2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> ```
> -And it looks like this is also triggering thousands of daily emails because 
> the flag to send email in case of failure is set to True.-
> I have Airflow setup to use Celery and Redis as a backend queue service.



