benwatsonnandos opened a new issue #14955:
URL: https://github.com/apache/airflow/issues/14955


   **Apache Airflow version**: 2.0.1
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
   
   **Environment**: Ubuntu 20.04, Python 3.9
   
   - **Cloud provider or hardware configuration**: AWS EC2 (single instance)
   - **OS** (e.g. from /etc/os-release): Ubuntu 20.04
   - **Kernel** (e.g. `uname -a`): 5.4.0-1038-aws #40-Ubuntu SMP Fri Feb 5 23:50:40 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
   - **Install tools**:
   - **Others**: Python 3.9 is a `venv`.
   
   **What happened**:
   
   I'm upgrading my old Airflow 1.10.5 / Python 2.7 environment to Airflow 2.0.1 and Python 3.9. I have a file `utils.py` in `~/airflow/plugins` (the directory referenced by `plugins_folder` in `airflow.cfg`) that contains a macro used by multiple DAGs (file contents simplified for brevity):
   
   ```python
   from airflow.plugins_manager import AirflowPlugin
   
   def ret_one():
       return 1
   
   class AirflowPluginsTest(AirflowPlugin):
       name = "ret_one"
       macros = [ret_one]
   ```
   
   After restarting Airflow, I can see `ret_one` listed as a macro under `Admin -> Plugins` in the UI, but `from airflow.macros import ret_one` in a DAG or in ipython fails with a missing module exception, so the macro can't be used. This worked in Airflow 1.10.5.
   
   
   **What you expected to happen**:
   
   The macro should be found and should be available to DAGs.
   
   
   **How to reproduce it**:
   
   Run Airflow 2.0.1 on Python 3.9. Create a custom macro (like the above), and 
try to import it in a DAG/ipython.
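   
   A minimal repro sketch of the failing import inside a DAG file (the DAG id, dates, and BashOperator usage here are illustrative placeholders, not taken from my real DAGs):
   
   ```python
   # dags/macro_import_repro.py -- illustrative repro sketch
   from datetime import datetime
   
   from airflow import DAG
   from airflow.operators.bash import BashOperator
   
   # With the plugin above loaded, this import worked on Airflow 1.10.5
   # but fails on 2.0.1 with a missing module exception:
   from airflow.macros import ret_one
   
   with DAG(
       dag_id="macro_import_repro",
       start_date=datetime(2021, 1, 1),
       schedule_interval=None,
   ) as dag:
       BashOperator(
           task_id="print_macro",
           bash_command=f"echo {ret_one()}",
       )
   ```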

