[ https://issues.apache.org/jira/browse/AIRFLOW-5071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17326169#comment-17326169 ]

ASF GitHub Bot commented on AIRFLOW-5071:
-----------------------------------------

christine-le commented on issue #10790:
URL: https://github.com/apache/airflow/issues/10790#issuecomment-823683394


   Posting this here in case anyone else finds it helpful. We recently upgraded 
to Airflow 2.0.1 and were getting the "Was the task killed externally?" error 
in our scheduler logs for one of our DAGs:
    
   
   ```
   [2021-04-20 20:55:10,768] {{scheduler_job.py:1235}} ERROR - Executor reports 
task instance <TaskInstance: dag_foo_bar.task-foo-bar 2021-04-20 20:54:00+00:00 
[queued]> finished (failed) although the task says its queued. (Info: Celery 
command failed on host: ip-10-0-0-113.ec2.internal) Was the task killed 
externally?
   [2021-04-20 20:56:02,594] {{scheduler_job.py:1235}} ERROR - Executor reports 
task instance <TaskInstance: dag_foo_bar.task-foo-bar 2021-04-20 20:55:00+00:00 
[queued]> finished (failed) although the task says its queued. (Info: None) Was 
the task killed externally?
   ```
   
   Searching through our Celery worker logs, I found a more accurate error 
message indicating that our DAG could not be found:
   
   ```
   [2021-04-20 20:54:28,846: INFO/ForkPoolWorker-15] Executing command in 
Celery: ['airflow', 'tasks', 'run', 'dag_foo_bar', 'task-foo-bar', 
'2021-04-16T18:57:00+00:00', '--local', '--pool', 'default_pool', '--subdir', 
'/efs/airflow/dags/our_dags.py']
   [2021-04-20 20:54:29,109: ERROR/ForkPoolWorker-15] Failed to execute task 
dag_id could not be found: dag_foo_bar. Either the dag did not exist or it 
failed to parse..
   ```
   
   In this case, it was a small, silly mistake: we had accidentally deleted the 
DAG. An easy fix, but the initial error was a little misleading and threw us 
off.
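
   If it helps anyone else debugging the same message: a quick way to confirm 
a parse/"could not be found" failure like this is to load the file with 
Airflow's `DagBag` and inspect the result. This is just a minimal sketch, 
assuming Airflow 2.x and using the `--subdir` path and `dag_foo_bar` id from 
the worker log above:
   
   ```python
   from airflow.models import DagBag
   
   # Parse only the file the worker was told to load (the --subdir argument).
   bag = DagBag(dag_folder="/efs/airflow/dags/our_dags.py", include_examples=False)
   
   # Import/parse failures land here with full tracebacks, instead of hiding
   # behind the scheduler's "Was the task killed externally?" message.
   for fileloc, err in bag.import_errors.items():
       print(f"{fileloc}: {err}")
   
   # If the file parses cleanly but the id is missing, the DAG was deleted or
   # renamed, which is what had happened in our case.
   print("dag_foo_bar" in bag.dags)
   ```
   
   Newer Airflow 2.x releases also expose the same information via the 
`airflow dags list-import-errors` CLI command.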
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Thousands of Executor reports task instance X finished (success) although the 
> task says its queued. Was the task killed externally?
> ----------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-5071
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5071
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DAG, scheduler
>    Affects Versions: 1.10.3
>            Reporter: msempere
>            Priority: Critical
>             Fix For: 1.10.12
>
>         Attachments: image-2020-01-27-18-10-29-124.png, 
> image-2020-07-08-07-58-42-972.png
>
>
> I'm opening this issue because, since updating to 1.10.3, I'm seeing thousands 
> of daily messages like the following in the logs:
>  
> ```
>  {{__init__.py:1580}} ERROR - Executor reports task instance <TaskInstance: X 
> 2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> {{jobs.py:1484}} ERROR - Executor reports task instance <TaskInstance: X 
> 2019-07-29 00:00:00+00:00 [queued]> finished (success) although the task says 
> its queued. Was the task killed externally?
> ```
> -And it looks like this is also triggering thousands of daily emails, because 
> the flag to send an email on failure is set to True.-
> I have Airflow set up to use Celery and Redis as the backend queue service.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
