mzecov opened a new issue, #33918:
URL: https://github.com/apache/airflow/issues/33918
### Apache Airflow version
2.7.0
### What happened
The variable `{{ prev_start_date_success }}` returns unexpected results when the DAG is manually triggered.
With the following DAG definition file:
```py
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    "demo",
    schedule="*/1 * * * *",
    start_date=datetime(2023, 1, 1),
    max_active_runs=1,
    catchup=False,
) as dag:
    # Single task that only echoes prev_start_date_success
    t1 = BashOperator(
        task_id="templated",
        depends_on_past=False,
        bash_command="""echo "{{ prev_start_date_success }}" """,
    )
```
(a single task that only displays `prev_start_date_success`, scheduled every minute for demo purposes)
With the following DAG runs:
- DAG run #1: automatically triggered at 2023-08-30 12:03:00 UTC; `data_interval_start` is 2023-08-30 12:02:00 UTC
- DAG run #2: **manually** triggered **at 2023-08-30 12:03:34 UTC**; `data_interval_start` is 2023-08-30 12:02:00 UTC
- DAG run #3: automatically triggered at 2023-08-30 12:04:00 UTC; `data_interval_start` is 2023-08-30 12:03:00 UTC
- DAG run #4: automatically triggered at 2023-08-30 12:05:00 UTC; `data_interval_start` is 2023-08-30 12:04:00 UTC
Logs:
- DAG run #2: `['/bin/bash', '-c', 'echo "2023-08-30T12:03:00.378342+00:00" ']`
- DAG run #3: `['/bin/bash', '-c', 'echo "2023-08-30T12:03:00.378342+00:00" ']`
- DAG run #4: `['/bin/bash', '-c', 'echo "2023-08-30T12:03:34.363933+00:00" ']`
### What you think should happen instead
[Here](https://airflow.apache.org/docs/apache-airflow/stable/templates-ref.html), `prev_start_date_success` is documented as *"Start date from prior successful DagRun (if available)."* But:
- DAG run #3 prints the start date of run #1, not of run #2 (the manually triggered one). I thought runs might be sorted on `data_interval_start` to determine the *"prior"* run, and that value is the same for runs #1 and #2; or perhaps manual triggers are excluded.
- But then why does DAG run #4 print the start date of run #2? The *"prior"* run should be #3, not #2.
- Is this a bug?
- If not, could this behaviour be explained in the documentation?

Note that with `CronTriggerTimetable` the results are different and more coherent (#3 uses #2, and #4 uses #3).
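One hypothesis that would explain both observations (my assumption, not verified against Airflow internals): the prior successful run is resolved by ordering runs on their *logical date*, and a manual run's logical date is the wall-clock trigger time. A minimal sketch (all names and the resolution logic are illustrative; timestamps rounded to seconds) that reproduces the observed logs under that assumption:

```python
from datetime import datetime, timezone

UTC = timezone.utc

# (logical_date, start_date) for the successful runs listed above.
# Assumption: the manual run's logical date is its trigger time (12:03:34),
# which sorts AFTER run #3's logical date (12:03:00).
runs = [
    (datetime(2023, 8, 30, 12, 2, tzinfo=UTC), datetime(2023, 8, 30, 12, 3, 0, tzinfo=UTC)),       # run 1 (scheduled)
    (datetime(2023, 8, 30, 12, 3, 34, tzinfo=UTC), datetime(2023, 8, 30, 12, 3, 34, tzinfo=UTC)),  # run 2 (manual)
    (datetime(2023, 8, 30, 12, 3, tzinfo=UTC), datetime(2023, 8, 30, 12, 4, 0, tzinfo=UTC)),       # run 3 (scheduled)
]

def prev_start_date_success(current_logical_date):
    """Start date of the run with the greatest logical date strictly before ours."""
    prior = [r for r in runs if r[0] < current_logical_date]
    return max(prior, key=lambda r: r[0])[1] if prior else None

# Run #3 (logical date 12:03): the manual run sorts after it, so the prior
# run is #1 -> 12:03:00, matching the log.
print(prev_start_date_success(datetime(2023, 8, 30, 12, 3, tzinfo=UTC)))
# Run #4 (logical date 12:04): the manual run is now the latest prior run
# -> 12:03:34, matching the log.
print(prev_start_date_success(datetime(2023, 8, 30, 12, 4, tzinfo=UTC)))
```

This is only a model of the symptom; confirming it would require checking how Airflow actually orders DagRuns when resolving the prior successful run.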
### How to reproduce
- Use the DAG definition file above
- Manually trigger the DAG in the Airflow UI, between two scheduled runs
- Observe the logs of the next two DAG runs
### Operating System
Runs in a container (image 2.7.0-python3.10)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)