JavierLopezT edited a comment on issue #15649:
URL: https://github.com/apache/airflow/issues/15649#issuecomment-849554693


   > Are you running the "get_pandas_df" at the top-level of your DAG file? if 
so, then yes - scheduler will try to execute it while parsing the DAG. Python 
DAG file is "imported" and "executed" by the scheduler regularly. All the 
processing should happen in the operator's "execute()" methods (either 
operators you use or the custom operators you write).
   
   No, nothing is in the DAG file but the operator definitions, as the 
best practices state. The `get_pandas_df` call is in a Python file that is 
executed by a custom BashOperator.
   
   ```
   dags/
       import_blog/
           dag_import_blog.py
           py_get_query.py
   ```
   
   In dag_import_blog.py:
   
   ```python
   with DAG(blabla):
       task_1 = BashOperator(
           task_id='t1',
           bash_command='python3 /opt/airflow/dags/import_blog/py_get_query.py',
       )
   ```
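
   For context, a minimal sketch of what `py_get_query.py` might look like (the function name `run_query` and the placeholder return value are assumptions for illustration, not the actual script): because the `get_pandas_df` call lives inside this separate script, it only runs when the BashOperator executes it, never while the scheduler parses the DAG file.

   ```python
   # Hypothetical sketch of py_get_query.py.
   # The expensive query happens here, at task-execution time,
   # not at DAG-parse time.

   def run_query():
       # In the real script this would be something like:
       #   from airflow.providers.mysql.hooks.mysql import MySqlHook
       #   df = MySqlHook(mysql_conn_id='my_conn').get_pandas_df('SELECT ...')
       # Placeholder rows standing in for the fetched DataFrame:
       rows = [{"id": 1, "title": "example post"}]
       return rows

   if __name__ == "__main__":
       print(run_query())
   ```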


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
