christine-le commented on issue #10790:
URL: https://github.com/apache/airflow/issues/10790#issuecomment-823683394


   Posting this here in case anyone else finds it helpful. We recently upgraded 
to Airflow 2.0.1 and were getting the "was the task killed externally?" error 
in our scheduler logs for one of our dags:
    
   
   ```
   [2021-04-20 20:55:10,768] {{scheduler_job.py:1235}} ERROR - Executor reports 
task instance <TaskInstance: dag_foo_bar.task-foo-bar 2021-04-20 20:54:00+00:00 
[queued]> finished (failed) although the task says its queued. (Info: Celery 
command failed on host: ip-10-0-0-113.ec2.internal) Was the task killed 
externally?
   [2021-04-20 20:56:02,594] {{scheduler_job.py:1235}} ERROR - Executor reports 
task instance <TaskInstance: dag_foo_bar.task-foo-bar 2021-04-20 20:55:00+00:00 
[queued]> finished (failed) although the task says its queued. (Info: None) Was 
the task killed externally?
   ```
   
   Searching through our Celery worker logs, I found a more accurate error 
message indicating that our dag could not be found:
   
   ```
   [2021-04-20 20:54:28,846: INFO/ForkPoolWorker-15] Executing command in 
Celery: ['airflow', 'tasks', 'run', 'dag_foo_bar', 'task-foo-bar', 
'2021-04-16T18:57:00+00:00', '--local', '--pool', 'default_pool', '--subdir', 
'/efs/airflow/dags/our_dags.py']
   [2021-04-20 20:54:29,109: ERROR/ForkPoolWorker-15] Failed to execute task 
dag_id could not be found: dag_foo_bar. Either the dag did not exist or it 
failed to parse..
   ```
   
   In this case, it was a small, silly mistake: we had accidentally deleted the 
dag. An easy fix, but the initial error was misleading and threw us off.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
