prongs commented on issue #9722: URL: https://github.com/apache/airflow/issues/9722#issuecomment-823212675
This is not working in Airflow 2.0. Here is my setup. The DAG bag is located at `/opt/airflow/dags`.

#### Local module named `b`

```bash
airflow@airflow-execteam-azuredatabricks-688d565566-gwgks:/opt/airflow/dags$ ls b
__init__.py  __pycache__  b.py
airflow@airflow-execteam-azuredatabricks-688d565566-gwgks:/opt/airflow/dags$
```

#### DAG code

`/opt/airflow/dags/a.py` is copied from the [tutorial](https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html), with `from b import *` added as the first line:

```python
from b import *

from datetime import timedelta
from textwrap import dedent

# The DAG object; we'll need this to instantiate a DAG
from airflow import DAG

# Operators; we need this to operate!
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago

# These args will get passed on to each operator
# You can override them on a per-task basis during operator initialization
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
    # 'wait_for_downstream': False,
    # 'dag': dag,
    # 'sla': timedelta(hours=2),
    # 'execution_timeout': timedelta(seconds=300),
    # 'on_failure_callback': some_function,
    # 'on_success_callback': some_other_function,
    # 'on_retry_callback': another_function,
    # 'sla_miss_callback': yet_another_function,
    # 'trigger_rule': 'all_success'
}
with DAG(
    'tutorial',
    default_args=default_args,
    description='A simple tutorial DAG',
    schedule_interval=timedelta(days=1),
    start_date=days_ago(2),
    tags=['example'],
) as dag:

    # t1, t2 and t3 are examples of tasks created by instantiating operators
    t1 = BashOperator(
        task_id='print_date',
        bash_command='date',
    )

    t2 = BashOperator(
        task_id='sleep',
        depends_on_past=False,
        bash_command='sleep 5',
        retries=3,
    )
    dag.doc_md = __doc__
    t1.doc_md = dedent(
        """\
    #### Task Documentation
    You can document your task using the attributes `doc_md` (markdown),
    `doc` (plain text), `doc_rst`, `doc_json`, `doc_yaml` which gets
    rendered in the UI's Task Instance Details page.
    """
    )

    templated_command = dedent(
        """
    {% for i in range(5) %}
        echo "{{ ds }}"
        echo "{{ macros.ds_add(ds, 7) }}"
        echo "{{ params.my_param }}"
    {% endfor %}
    """
    )

    t3 = BashOperator(
        task_id='templated',
        depends_on_past=False,
        bash_command=templated_command,
        params={'my_param': 'Parameter I passed in'},
    )

    t1 >> [t2, t3]
```

Now, `airflow dags list` reports:

```
dag_id   | filepath | owner   | paused
=========+==========+=========+=======
tutorial | a.py     | airflow | None
```

And `airflow dags report` reports:

```
file  | duration       | dag_num | task_num | dags
======+================+=========+==========+=========
/a.py | 0:00:00.295103 | 1       | 3        | tutorial
```

However, when I open the web UI, I see the following: *(screenshot omitted)*

Here's the version we're on: *(screenshot omitted)*
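The contents of the local package `b` are not shown in the issue. As a point of reference, a hypothetical minimal package matching the `ls` output above can be recreated like this (the `GREETING` name and the file contents are assumptions, used only to reproduce the layout):

```python
import os
import sys
import tempfile

# Hypothetical minimal contents for the local package `b`; this just
# reproduces the reported dags/b/{__init__.py,b.py} layout in a scratch dir.
dags_dir = tempfile.mkdtemp()
pkg_dir = os.path.join(dags_dir, "b")
os.makedirs(pkg_dir)

with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("from .b import *\n")  # re-export everything from b/b.py

with open(os.path.join(pkg_dir, "b.py"), "w") as f:
    f.write("GREETING = 'hello from b'\n")

# `from b import *` in a.py only resolves if the DAG folder is on sys.path,
# which is what Airflow's DAG processor arranges when parsing files there.
sys.path.insert(0, dags_dir)
from b import GREETING  # noqa: E402

print(GREETING)  # -> hello from b
```

This imports cleanly when run as a plain script, which suggests the failure reported here is about which process (scheduler vs. webserver) has the DAG folder on its `sys.path`.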
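One commonly suggested workaround for this class of import failure (a sketch, not an official Airflow fix) is to prepend the DAG file's own directory to `sys.path` before the local import, so `from b import *` resolves in every process that parses the file:

```python
import os
import sys

# Sketch: place this at the very top of /opt/airflow/dags/a.py, before
# `from b import *`. Falls back to the cwd if __file__ is unavailable
# (e.g. when the snippet is pasted into an interactive session).
dag_dir = (
    os.path.dirname(os.path.abspath(__file__))
    if "__file__" in globals()
    else os.getcwd()
)
if dag_dir not in sys.path:
    sys.path.insert(0, dag_dir)
```

The `in sys.path` guard keeps the path from being inserted repeatedly, since Airflow re-parses DAG files on every scheduler loop.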
