Rafnel opened a new issue, #31487:
URL: https://github.com/apache/airflow/issues/31487
### Apache Airflow version
Other Airflow 2 version (please specify below)
### What happened
I have an Airflow DAG (Airflow 2.6.0) where the code looks like this:
```python
import pendulum
from airflow import DAG

with DAG(dag_id="my_dag",
         start_date=pendulum.datetime(2023, 5, 1, tz="America/Chicago"),
         schedule_interval="0 4 * * *",  # Every day at 4AM Central
         catchup=False) as dag:
    ...
```
Note that this is a timezone-aware DAG for the America/Chicago timezone:
https://airflow.apache.org/docs/apache-airflow/stable/authoring-and-scheduling/timezone.html#time-zone-aware-dags
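For context, the tz-aware `start_date` means the cron expression should be evaluated in America/Chicago time, so during daylight saving (CDT, UTC-5) the 4AM trigger corresponds to 09:00 UTC. A quick sketch confirming the offset (using the stdlib `zoneinfo` instead of `pendulum`, but the arithmetic is the same):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# 4AM America/Chicago on the date in question (CDT in May, UTC-5)
local_4am = datetime(2023, 5, 23, 4, 0, tzinfo=ZoneInfo("America/Chicago"))
utc_equiv = local_4am.astimezone(timezone.utc)
print(utc_equiv)  # 2023-05-23 09:00:00+00:00
```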
We turned this DAG on yesterday evening and it immediately began running its 2023-05-22 run, which we allowed to happen.

We then expected it to run at 4AM today, 2023-05-23, but it never did.

You can see the data interval ended at 4AM this morning, but the DAG never triggered. The UI even says the next run was 5 hours ago (screenshot taken around 8:30AM America/Chicago time).
Then, adding to my confusion, at exactly 9AM America/Chicago time (14:00 UTC) the DAG kicked off and began running.

This plainly seems like a bug to me, but I was hoping for a second opinion.
Is there something I'm doing wrong here? Why did it not run at 4AM? I'm also
really confused how the cron of `0 4 * * *` translates to a run at 9AM CDT /
2PM UTC.
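For what it's worth, the arithmetic behind that confusion can be written out explicitly: 4AM CDT is 09:00 UTC, and the observed 14:00 UTC start is exactly five hours later, as if the CDT offset had been applied a second time (a stdlib sketch; the double-offset interpretation is only my guess):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# When the run should have started: 4AM America/Chicago (CDT, UTC-5)
expected = datetime(2023, 5, 23, 4, 0, tzinfo=ZoneInfo("America/Chicago"))
expected_utc = expected.astimezone(timezone.utc)  # 09:00 UTC

# When the run actually started: 14:00 UTC (9AM CDT)
observed_utc = datetime(2023, 5, 23, 14, 0, tzinfo=timezone.utc)

extra = observed_utc - expected_utc
print(extra)  # 5:00:00 -- the same magnitude as the CDT offset from UTC
```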
Thank you!
### What you think should happen instead
I believe the DAG should have run after the data interval end, at 4AM CDT
instead of 9AM CDT.
### How to reproduce
DAG code:
```python
import pendulum
from airflow import DAG

with DAG(dag_id="my_dag",
         start_date=pendulum.datetime(2023, 5, 1, tz="America/Chicago"),
         schedule_interval="0 4 * * *",  # Every day at 4AM Central
         catchup=False) as dag:
    ...
```
and my airflow.cfg has these two configs if it makes any difference:
`default_timezone = America/Chicago`
`default_ui_timezone = America/Chicago`
### Operating System
Ubuntu 20.04 LTS
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
Deploying Airflow on "bare metal":
1. Set up Python virtualenv
2. point dags_folder in airflow.cfg to my dag code folder
3. Change sql_alchemy_conn in airflow.cfg to the connection string for our
production Airflow MSSQL DB (mssql+pyodbc). Driver is FreeTDS
4. Run `airflow db init`
### Anything else
It seems like a constant +5 hours is being added to any cron schedule I
create.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)