daoudjahdou edited a comment on issue #15978:
URL: https://github.com/apache/airflow/issues/15978#issuecomment-878909135


   Just tried it on both 2.1.1 and 1.10.15 with the same behaviour. I even tried the example found here:

   https://airflow.apache.org/docs/apache-airflow/1.10.15/_modules/airflow/example_dags/example_subdag_operator.html
   
   ```py
   from airflow.example_dags.subdags.subdag import subdag
   from airflow.models import DAG
   from airflow.operators.dummy_operator import DummyOperator
   from airflow.operators.subdag_operator import SubDagOperator
   from airflow.utils.dates import days_ago
   
   DAG_NAME = 'example_subdag_operator'
   
   args = {
       'owner': 'Airflow',
       'start_date': days_ago(2),
   }
   
   dag = DAG(
       dag_id=DAG_NAME,
       default_args=args,
       schedule_interval="@once",
       tags=['example']
   )
   
   start = DummyOperator(
       task_id='start',
       dag=dag,
   )
   
   section_1 = SubDagOperator(
       task_id='section-1',
       subdag=subdag(DAG_NAME, 'section-1', args),
       dag=dag,
   )
   
   some_other_task = DummyOperator(
       task_id='some-other-task',
       dag=dag,
   )
   
   section_2 = SubDagOperator(
       task_id='section-2',
       subdag=subdag(DAG_NAME, 'section-2', args),
       dag=dag,
   )
   
   end = DummyOperator(
       task_id='end',
       dag=dag,
   )
   
   start >> section_1 >> some_other_task >> section_2 >> end
   ```
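
   For context, the `subdag` factory imported above is the one shipped with Airflow's example DAGs. As a rough sketch (based on the 1.10.x example module linked above, so treat the details as approximate), it builds a child DAG with five dummy tasks:

   ```py
   from airflow.models import DAG
   from airflow.operators.dummy_operator import DummyOperator


   def subdag(parent_dag_name, child_dag_name, args):
       # SubDagOperator expects the child DAG id to be '<parent_dag_id>.<task_id>'.
       dag_subdag = DAG(
           dag_id='%s.%s' % (parent_dag_name, child_dag_name),
           default_args=args,
           schedule_interval="@daily",
       )

       # Five no-op tasks inside the child DAG.
       for i in range(5):
           DummyOperator(
               task_id='%s-task-%s' % (child_dag_name, i + 1),
               default_args=args,
               dag=dag_subdag,
           )

       return dag_subdag
   ```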
   
   The log is as follows:
   ```
     ____________       _____________
    ____    |__( )_________  __/__  /________      __
   ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
   ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
    _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
   Starting flask
    * Serving Flask app "airflow.utils.serve_logs" (lazy loading)
    * Environment: production
      WARNING: This is a development server. Do not use it in a production deployment.
      Use a production WSGI server instead.
    * Debug mode: off
   [2021-07-13 08:57:29,682] {_internal.py:113} INFO -  * Running on http://0.0.0.0:8793/ (Press CTRL+C to quit)
   [2021-07-13 08:57:29,690] {scheduler_job.py:1270} INFO - Starting the scheduler
   [2021-07-13 08:57:29,691] {scheduler_job.py:1275} INFO - Processing each file at most -1 times
   [2021-07-13 08:57:29,696] {dag_processing.py:254} INFO - Launched DagFileProcessorManager with pid: 129
   [2021-07-13 08:57:29,697] {scheduler_job.py:1839} INFO - Resetting orphaned tasks for active dag runs
   [2021-07-13 08:57:29,701] {settings.py:52} INFO - Configured default timezone Timezone('UTC')
   [2021-07-13 08:57:29,711] {dag_processing.py:532} WARNING - Because we cannot use more than 1 thread (parsing_processes = 2 ) when using sqlite. So we set parallelism to 1.
   [2021-07-13 08:57:37,907] {scheduler_job.py:964} INFO - 1 tasks up for execution:
           <TaskInstance: example_subdag_operator.section-1 2021-07-11 00:00:00+00:00 [scheduled]>
   [2021-07-13 08:57:37,909] {scheduler_job.py:998} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
   [2021-07-13 08:57:37,909] {scheduler_job.py:1025} INFO - DAG example_subdag_operator has 0/16 running and queued tasks
   [2021-07-13 08:57:37,909] {scheduler_job.py:1086} INFO - Setting the following tasks to queued state:
           <TaskInstance: example_subdag_operator.section-1 2021-07-11 00:00:00+00:00 [scheduled]>
   [2021-07-13 08:57:37,911] {scheduler_job.py:1128} INFO - Sending TaskInstanceKey(dag_id='example_subdag_operator', task_id='section-1', execution_date=datetime.datetime(2021, 7, 11, 0, 0, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 4 and queue default
   [2021-07-13 08:57:37,911] {base_executor.py:82} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_subdag_operator', 'section-1', '2021-07-11T00:00:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/dags/monitor.py']
   [2021-07-13 08:57:37,922] {sequential_executor.py:59} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_subdag_operator', 'section-1', '2021-07-11T00:00:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/dags/monitor.py']
   [2021-07-13 08:57:39,332] {dagbag.py:496} INFO - Filling up the DagBag from /opt/airflow/dags/monitor.py
   Running <TaskInstance: example_subdag_operator.section-1 2021-07-11T00:00:00+00:00 [queued]> on host 969b77ead72f
   ```
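
   The quoted log ends right there, just after `section-1` starts running. If it helps with reproducing, the command the SequentialExecutor issued (reassembled from the argv list in the log above) should be equivalent to running this by hand:

   ```
   airflow tasks run example_subdag_operator section-1 2021-07-11T00:00:00+00:00 --local --pool default_pool --subdir /opt/airflow/dags/monitor.py
   ```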

