yogaliving opened a new issue, #35906:
URL: https://github.com/apache/airflow/issues/35906
### Description
I am running Airflow 2.7.3 on a single machine. Airflow is installed in a
Python virtualenv, and `AIRFLOW_HOME` is `/mnt/bmr/airflow`.
```python
#!/usr/bin/python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'test_owner',
    'depends_on_past': False,
    'start_date': datetime(2023, 11, 13),
    'retries': 0,
    'retry_delay': timedelta(minutes=1),
}

dag = DAG('tsy-succ-test', default_args=default_args,
          schedule_interval=timedelta(days=1))

t1 = BashOperator(
    task_id='t1',
    bash_command='hdfs dfs -ls /',
    run_as_user='hdfs',
    dag=dag)
```
Running this DAG fails, and the task log shows
`airflow.exceptions.DagRunNotFound: DagRun for tsy-succ-test with run_id or execution_date of 'scheduled__2023-11-27T00:00:00+00:00' not found`.
However, I can find `scheduled__2023-11-27T00:00:00+00:00` in the MySQL metadata database.
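For context on where the lookup can go wrong: when `run_as_user` is set, the task runner re-executes the task command through `sudo` as the target user. The sketch below is illustrative, not Airflow's actual code; if the impersonated user's environment ends up without `AIRFLOW_HOME` or the DB connection settings, the re-run process can point at a different (empty) metadata DB, and the DagRun lookup then raises `DagRunNotFound` even though the row exists in MySQL.

```python
# Illustrative sketch (NOT Airflow's real implementation) of how a
# run_as_user setting re-wraps the task command via sudo.
import shlex


def wrap_with_run_as_user(run_as_user: str, command: str) -> list:
    # -E/-H roughly mirror attempts to carry the environment over to the
    # impersonated user; if key variables are dropped here, the child
    # process may resolve a different metadata database
    return ["sudo", "-E", "-H", "-u", run_as_user] + shlex.split(command)


cmd = wrap_with_run_as_user(
    "hdfs",
    "airflow tasks run tsy-succ-test t1 scheduled__2023-11-27T00:00:00+00:00",
)
print(cmd[:5])
```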
If I instead use this DAG, which is identical except that `run_as_user` is removed:
```python
#!/usr/bin/python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'test_owner',
    'depends_on_past': False,
    'start_date': datetime(2023, 11, 13),
    'retries': 0,
    'retry_delay': timedelta(minutes=1),
}

dag = DAG('tsy-succ-test', default_args=default_args,
          schedule_interval=timedelta(days=1))

t1 = BashOperator(
    task_id='t1',
    bash_command='hdfs dfs -ls /',
    dag=dag)
```
the task succeeds.

I start Airflow as the root user.
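A possible workaround while this is investigated (assuming root can `sudo` to the `hdfs` user) is to drop `run_as_user` and impersonate only inside the shell command, so the task process itself keeps running as the user that owns the Airflow environment and retains its metadata DB access:

```python
# Hypothetical workaround: impersonate only for the hdfs command, not for
# the whole task process. The task keeps the scheduler user's environment.
bash_command = "sudo -u hdfs hdfs dfs -ls /"

# in the failing DAG this would replace the original operator, e.g.:
# t1 = BashOperator(task_id='t1', bash_command=bash_command, dag=dag)
print(bash_command)
```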
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]