bhavaniravi commented on issue #23411:
URL: https://github.com/apache/airflow/issues/23411#issuecomment-1121377193

   I noticed this behavior with a custom operator for a customer and could reproduce it in Astronomer Dev using a simple Python sensor. Contrary to the original ticket, I also see `Executor reports task instance finished (success) although the task says its queued. (Info: None) Was the task killed externally?`
   
   **Dag Code**
   
   ```python
   import os
   import random
   import time
   from datetime import datetime

   from airflow import DAG
   from airflow.sensors.python import PythonSensor


   def _partner_b():
       # Simulate a flaky upstream check: succeeds only when the random draw exceeds 0.8.
       time.sleep(10)
       r = random.random()
       print("Partner B: {}".format(r))
       return r > 0.8


   dag = DAG(
       dag_id=os.path.basename(__file__).replace(".py", ""),
       start_date=datetime(2021, 9, 22),
       default_args={},
       schedule_interval="*/5 * * * *",
       catchup=False,
   )

   partner_b = PythonSensor(
       task_id="partner_b",
       poke_interval=60 * 2,  # poke every 2 minutes
       mode="reschedule",
       python_callable=_partner_b,
       dag=dag,
       weight_rule="absolute",
   )
   ```
   
   **Log Trace**
   ```
   [2022-05-09 17:20:11,228] {scheduler_job.py:543} INFO - Sending TaskInstanceKey(dag_id='sensor_example', task_id='partner_b', run_id='scheduled__2022-05-09T17:15:00+00:00', try_number=1, map_index=-1) to executor with priority 1 and queue default
   [2022-05-09 17:20:11,228] {base_executor.py:91} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'sensor_example', 'partner_b', 'scheduled__2022-05-09T17:15:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/sensor_example.py']
   [2022-05-09 17:20:11,230] {base_executor.py:211} INFO - task TaskInstanceKey(dag_id='sensor_example', task_id='partner_b', run_id='scheduled__2022-05-09T17:15:00+00:00', try_number=1, map_index=-1) is still running
   [2022-05-09 17:20:11,230] {scheduler_job.py:596} INFO - Executor reports execution of sensor_example.partner_b run_id=scheduled__2022-05-09T17:15:00+00:00 exited with status success for try_number 1
   [2022-05-09 17:20:11,235] {scheduler_job.py:640} INFO - TaskInstance Finished: dag_id=sensor_example, task_id=partner_b, run_id=scheduled__2022-05-09T17:15:00+00:00, map_index=-1, run_start_date=2022-05-09 17:20:00.834723+00:00, run_end_date=2022-05-09 17:20:11.021978+00:00, run_duration=10.187255, state=queued, executor_state=success, try_number=1, max_tries=0, job_id=78, pool=default_pool, queue=default, priority_weight=1, operator=PythonSensor, queued_dttm=2022-05-09 17:20:11.226889+00:00, queued_by_job_id=72, pid=462
   [2022-05-09 17:20:11,235] {scheduler_job.py:669} ERROR - Executor reports task instance <TaskInstance: sensor_example.partner_b scheduled__2022-05-09T17:15:00+00:00 [queued]> finished (success) although the task says its queued. (Info: None) Was the task killed externally?
   [2022-05-09 17:20:11,237] {taskinstance.py:1890} ERROR - Executor reports task instance <TaskInstance: sensor_example.partner_b scheduled__2022-05-09T17:15:00+00:00 [queued]> finished (success) although the task says its queued. (Info: None) Was the task killed externally?
   [2022-05-09 17:20:11,240] {taskinstance.py:1394} INFO - Marking task as FAILED. dag_id=sensor_example, task_id=partner_b, execution_date=20220509T171500, start_date=20220509T172000, end_date=20220509T172011
   ```
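   
   For what it's worth, the failure path in the trace boils down to a state mismatch: the executor reports `success` while the task instance is still `queued`, so the scheduler assumes the task was killed externally and marks it `FAILED`. Below is a minimal, illustrative sketch of that reconciliation; the names and the `reconcile` function are mine, not Airflow's internals.
   
   ```python
   from enum import Enum


   class State(str, Enum):
       QUEUED = "queued"
       SUCCESS = "success"
       FAILED = "failed"


   def reconcile(ti_state: State, executor_state: State) -> State:
       """Return the state the scheduler would settle on, per the trace above."""
       if executor_state == State.SUCCESS and ti_state == State.QUEUED:
           # "finished (success) although the task says its queued" -> marked FAILED
           return State.FAILED
       return executor_state


   print(reconcile(State.QUEUED, State.SUCCESS).value)  # -> "failed", as in the last log line
   ```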

