turbaszek commented on pull request #11778:
URL: https://github.com/apache/airflow/pull/11778#issuecomment-717299031


   I have no idea why, but after these changes I'm unable to run the following DAGs:
   ```
   from airflow import DAG
   from airflow.operators.dagrun_operator import TriggerDagRunOperator
   from airflow.operators.dummy_operator import DummyOperator
   from airflow.utils.dates import days_ago
   
   with DAG("test_dr", start_date=days_ago(1), schedule_interval=None) as dag:
       tr = TriggerDagRunOperator(task_id="trigger", trigger_dag_id="triggered_dag")
   
   with DAG("triggered_dag", start_date=days_ago(1), schedule_interval=None) as dag2:
       DummyOperator(task_id="test")
   ```
   
   What I do:
   ```
   ./breeze --python=3.8 --backend=postgres
   export AIRFLOW__CORE__EXECUTOR=LocalExecutor
   airflow webserver -w 1 -D
   airflow scheduler
   ```
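   As a sanity check on the export above: Airflow maps environment variables of the form `AIRFLOW__{SECTION}__{KEY}` onto config options, so a typo in the name silently falls back to the default executor. A minimal sketch of that naming convention (`airflow_env_var` is a hypothetical helper, not Airflow code):
   ```python
   import os

   # Illustrative only: derive the AIRFLOW__{SECTION}__{KEY} env var name
   # for a given config option, mirroring Airflow's naming convention.
   def airflow_env_var(section: str, key: str) -> str:
       return f"AIRFLOW__{section.upper()}__{key.upper()}"

   os.environ[airflow_env_var("core", "executor")] = "LocalExecutor"
   print(os.environ["AIRFLOW__CORE__EXECUTOR"])  # LocalExecutor
   ```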
   Then I unpause both DAGs in the web UI and trigger "test_dr".
   
   What I get:
   ```
   root@d07e8a7697db:/opt/airflow# airflow scheduler
     ____________       _____________
    ____    |__( )_________  __/__  /________      __
   ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
   ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
    _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
   [2020-10-27 14:44:50,759] {scheduler_job.py:1269} INFO - Starting the scheduler
   [2020-10-27 14:44:50,759] {scheduler_job.py:1274} INFO - Processing each file at most -1 times
   [2020-10-27 14:44:50,761] {scheduler_job.py:1296} INFO - Resetting orphaned tasks for active dag runs
   [2020-10-27 14:44:50,806] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 3458
   [2020-10-27 14:44:50,861] {settings.py:49} INFO - Configured default timezone Timezone('UTC')
   [2020-10-27 14:46:29,369] {scheduler_job.py:973} INFO - 1 tasks up for execution:
        <TaskInstance: test_dr.trigger 2020-10-27 14:46:28.689412+00:00 [scheduled]>
   [2020-10-27 14:46:29,378] {scheduler_job.py:1007} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
   [2020-10-27 14:46:29,379] {scheduler_job.py:1035} INFO - DAG test_dr has 0/16 running and queued tasks
   [2020-10-27 14:46:29,381] {scheduler_job.py:1088} INFO - Setting the following tasks to queued state:
        <TaskInstance: test_dr.trigger 2020-10-27 14:46:28.689412+00:00 [scheduled]>
   [2020-10-27 14:46:29,390] {scheduler_job.py:1134} INFO - Sending TaskInstanceKey(dag_id='test_dr', task_id='trigger', execution_date=datetime.datetime(2020, 10, 27, 14, 46, 28, 689412, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 1 and queue default
   [2020-10-27 14:46:29,391] {base_executor.py:78} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'test_dr', 'trigger', '2020-10-27T14:46:28.689412+00:00', '--local', '--pool', 'default_pool', '--subdir', '/files/dags/td_test.py']
   [2020-10-27 14:46:29,392] {sequential_executor.py:57} INFO - Executing command: ['airflow', 'tasks', 'run', 'test_dr', 'trigger', '2020-10-27T14:46:28.689412+00:00', '--local', '--pool', 'default_pool', '--subdir', '/files/dags/td_test.py']
   [2020-10-27 14:46:41,266] {dagbag.py:436} INFO - Filling up the DagBag from /files/dags/td_test.py
   Running <TaskInstance: test_dr.trigger 2020-10-27T14:46:28.689412+00:00 [None]> on host d07e8a7697db
   ```
   But the task is not executed.
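   Worth noting: the "Executing command" line comes from `sequential_executor.py`, not the LocalExecutor set via the environment variable, so the scheduler may not be picking up the export. A small illustrative helper (hypothetical, not part of Airflow) to pull the executor module name out of a scheduler log line:
   ```python
   import re
   from typing import Optional

   # Hypothetical helper: extract the executor module from a log entry
   # such as "[...] {sequential_executor.py:57} INFO - Executing ...".
   def executor_from_log(line: str) -> Optional[str]:
       m = re.search(r"\{(\w+_executor)\.py:\d+\}", line)
       return m.group(1) if m else None

   line = "[2020-10-27 14:46:29,392] {sequential_executor.py:57} INFO - Executing command: [...]"
   print(executor_from_log(line))  # sequential_executor
   ```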


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

