rafidka edited a comment on issue #13824:
URL: https://github.com/apache/airflow/issues/13824#issuecomment-1055926552
I just tried a sleep DAG (similar to yours) and it also succeeded on Airflow
2.2.4:
```
[2022-03-01 14:25:43,323: INFO/MainProcess] Task
airflow.executors.celery_executor.execute_command[9f11d899-b415-4541-8e4f-1221fa7b6b09]
received
[2022-03-01 14:25:43,394: INFO/ForkPoolWorker-16] Executing command in
Celery: ['airflow', 'tasks', 'run', 'sleep', 'execute_fn',
'scheduled__2022-03-01T22:24:29.637891+00:00', '--local', '--subdir',
'DAGS_FOLDER/sleep.py']
[2022-03-01 14:25:43,394: INFO/ForkPoolWorker-16] Celery task ID:
9f11d899-b415-4541-8e4f-1221fa7b6b09
[2022-03-01 14:25:43,438: INFO/ForkPoolWorker-16] Filling up the DagBag from
/root/airflow/dags/sleep.py
[2022-03-01 14:25:43,498: WARNING/ForkPoolWorker-16] Running <TaskInstance:
sleep.execute_fn scheduled__2022-03-01T22:24:29.637891+00:00 [queued]> on host
1161269d3561
[2022-03-01 14:25:54,463: INFO/ForkPoolWorker-16] Task
airflow.executors.celery_executor.execute_command[9f11d899-b415-4541-8e4f-1221fa7b6b09]
succeeded in 11.133019998000236s: None
```
This is my DAG:
```Python
import os
import time
from datetime import timedelta

from airflow.decorators import dag, task
from airflow.utils.dates import days_ago

DAG_ID = os.path.basename(__file__).replace(".py", "")


@dag(dag_id=DAG_ID, schedule_interval=timedelta(minutes=1), catchup=False,
     start_date=days_ago(0), tags=['test'])
def sleep_dag():
    @task()
    def execute_fn():
        time.sleep(10)

    execute_fn_t = execute_fn()


test_dag_d = sleep_dag()
```
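(For context on the `start_date` above: `days_ago(n)` from `airflow.utils.dates` returns midnight UTC of the day n days ago, so `days_ago(0)` puts the start of the schedule earlier on the current day. A minimal stand-in sketch of that behavior, not Airflow's actual implementation:)

```python
from datetime import datetime, timedelta, timezone


def days_ago(n: int) -> datetime:
    # Sketch of airflow.utils.dates.days_ago: midnight UTC, n days in the past.
    today = datetime.now(timezone.utc).replace(
        hour=0, minute=0, second=0, microsecond=0
    )
    return today - timedelta(days=n)


# days_ago(0) is midnight today (UTC); with catchup=False, Airflow only runs
# the most recent schedule_interval window rather than backfilling.
```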
I suspect your setup has some issue (perhaps a stale configuration or
package). I would start clean or, even better, use Docker if you aren't already.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]