Tonkonozhenko opened a new issue #22224:
URL: https://github.com/apache/airflow/issues/22224
### Apache Airflow version
2.2.4 (latest released)
### What happened
The task runs a second time after a successful first run.
In the logs I see this suspicious line:
```
[2022-03-13, 09:10:59 UTC] {local_task_job.py:99} INFO - Task is not able to be run
```
Full log:
```
*** Reading remote log from s3://.../airflow/dag_id/task_id/2022-03-13T08:10:00+00:00/1.log.
[2022-03-13, 09:10:59 UTC] {taskinstance.py:1027} INFO - Dependencies not met for <TaskInstance: dag_id.task_id scheduled__2022-03-13T08:10:00+00:00 [scheduled]>, dependency 'Task Instance State' FAILED: Task is in the 'scheduled' state.
[2022-03-13, 09:10:59 UTC] {local_task_job.py:99} INFO - Task is not able to be run
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: dag_id.task_id scheduled__2022-03-13T08:10:00+00:00 [queued]>
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: dag_id.task_id scheduled__2022-03-13T08:10:00+00:00 [queued]>
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 3
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1264} INFO - Executing <Task(PythonOperator): task_id> on 2022-03-13 08:10:00+00:00
[2022-03-13, 09:11:16 UTC] {standard_task_runner.py:52} INFO - Started process 17071 to run task
[2022-03-13, 09:11:16 UTC] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'dag_id', 'task_id', 'scheduled__2022-03-13T08:10:00+00:00', '--job-id', '1331338', '--raw', '--subdir', 'DAGS_FOLDER/dag_id.py', '--cfg-path', '/tmp/tmp6jzxb7ar', '--error-file', '/tmp/tmpru98mcz9']
[2022-03-13, 09:11:16 UTC] {standard_task_runner.py:77} INFO - Job 1331338: Subtask task_id
[2022-03-13, 09:11:16 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: dag_id.task_id scheduled__2022-03-13T08:10:00+00:00 [running]> on host airflow-official-worker-7d594658ff-9d4nf
[2022-03-13, 09:11:17 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_EMAIL=...
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=dag_id
AIRFLOW_CTX_TASK_ID=task_id
AIRFLOW_CTX_EXECUTION_DATE=2022-03-13T08:10:00+00:00
AIRFLOW_CTX_DAG_RUN_ID=scheduled__2022-03-13T08:10:00+00:00
[2022-03-13, 09:11:17 UTC] {python.py:175} INFO - Done. Returned value was: None
[2022-03-13, 09:11:17 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=dag_id, task_id=task_id, execution_date=20220313T081000, start_date=20220313T091116, end_date=20220313T091117
[2022-03-13, 09:11:17 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-13, 09:11:17 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
```
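The DAG code is not attached here; for context, the task is a plain PythonOperator in an hourly DAG, roughly along the lines of the sketch below. The dag/task IDs, the schedule, and the retry count are placeholders inferred from the log above, not the real definitions:
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def do_work():
    # Placeholder callable; the real task finishes in about a second and
    # returns None ("Done. Returned value was: None" in the log).
    pass


with DAG(
    dag_id="dag_id",                  # placeholder
    start_date=datetime(2022, 1, 1),  # placeholder
    schedule_interval="10 * * * *",   # hourly; the run above has logical date 08:10 UTC
    catchup=False,
) as dag:
    task = PythonOperator(
        task_id="task_id",            # placeholder
        python_callable=do_work,
        retries=2,                    # "Starting attempt 1 of 3" implies up to 3 attempts
    )
```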
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
Debian 10, extended official Airflow image
### Versions of Apache Airflow Providers
_No response_
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### Anything else
We have several hourly DAGs, and this happens once every few days.
Full Logs:
```
*** Reading remote log from s3://.../airflow/dag_id/task_id/2022-03-13T08:10:00+00:00/1.log.
[2022-03-13, 09:10:59 UTC] {taskinstance.py:1027} INFO - Dependencies not met for <TaskInstance: dag_id.task_id scheduled__2022-03-13T08:10:00+00:00 [scheduled]>, dependency 'Task Instance State' FAILED: Task is in the 'scheduled' state.
[2022-03-13, 09:10:59 UTC] {local_task_job.py:99} INFO - Task is not able to be run
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: dag_id.task_id scheduled__2022-03-13T08:10:00+00:00 [queued]>
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1037} INFO - Dependencies all met for <TaskInstance: dag_id.task_id scheduled__2022-03-13T08:10:00+00:00 [queued]>
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1244} INFO - Starting attempt 1 of 3
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1245} INFO -
--------------------------------------------------------------------------------
[2022-03-13, 09:11:16 UTC] {taskinstance.py:1264} INFO - Executing <Task(PythonOperator): task_id> on 2022-03-13 08:10:00+00:00
[2022-03-13, 09:11:16 UTC] {standard_task_runner.py:52} INFO - Started process 17071 to run task
[2022-03-13, 09:11:16 UTC] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'dag_id', 'task_id', 'scheduled__2022-03-13T08:10:00+00:00', '--job-id', '1331338', '--raw', '--subdir', 'DAGS_FOLDER/aws_transforms.py', '--cfg-path', '/tmp/tmp6jzxb7ar', '--error-file', '/tmp/tmpru98mcz9']
[2022-03-13, 09:11:16 UTC] {standard_task_runner.py:77} INFO - Job 1331338: Subtask task_id
[2022-03-13, 09:11:16 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: dag_id.task_id scheduled__2022-03-13T08:10:00+00:00 [running]> on host airflow-official-worker-7d594658ff-9d4nf
[2022-03-13, 09:11:17 UTC] {taskinstance.py:1429} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_EMAIL=...
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=dag_id
AIRFLOW_CTX_TASK_ID=task_id
AIRFLOW_CTX_EXECUTION_DATE=2022-03-13T08:10:00+00:00
AIRFLOW_CTX_DAG_RUN_ID=scheduled__2022-03-13T08:10:00+00:00
[2022-03-13, 09:11:17 UTC] {python.py:175} INFO - Done. Returned value was: None
[2022-03-13, 09:11:17 UTC] {taskinstance.py:1272} INFO - Marking task as SUCCESS. dag_id=dag_id, task_id=task_id, execution_date=20220313T081000, start_date=20220313T091116, end_date=20220313T091117
[2022-03-13, 09:11:17 UTC] {local_task_job.py:154} INFO - Task exited with return code 0
[2022-03-13, 09:11:17 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
```
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)