stacymiller opened a new issue, #51002:
URL: https://github.com/apache/airflow/issues/51002

   ### Apache Airflow version
   
   3.0.1
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   I took a sample TaskFlow API DAG from [the documentation](https://airflow.apache.org/docs/apache-airflow/3.0.1/core-concepts/dags.html#declaring-a-dag), installed all required packages, and called the `test()` method as described in [the documentation](https://airflow.apache.org/docs/apache-airflow/3.0.1/core-concepts/debug.html#testing-dags-with-dag-test). This is how I usually test my DAGs, and on Airflow 2.* it worked fine. However, with Airflow 3.0.1 I received the following error:
   ```
   Traceback (most recent call last):
     File "/data/test.py", line 62, in <module>
       dag.test()
   AttributeError: 'DAG' object has no attribute 'test'
   ```
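   
   A defensive guard avoids the hard crash while the root cause is unclear. The sketch below is illustrative only (`run_local_test` is a made-up helper, not Airflow API):
   
   ```python
   # Illustrative guard (run_local_test is a made-up helper, not Airflow API):
   # call .test() only when the DAG object actually exposes it.
   def run_local_test(dag_obj):
       if hasattr(dag_obj, "test"):
           dag_obj.test()
           return True
       # Fallback idea: the `airflow dags test <dag_id> <logical_date>` CLI.
       return False
   ```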
   
   ### What you think should happen instead?
   
   The DAG should be executed locally. This is the behaviour I observe in Airflow 2.10, and this was not expected to change.
   
   ### How to reproduce
   
   The DAG file (both the TaskFlow API and the classic API exhibit this issue); assume it is called `test.py`:
   
   ```python
   # test.py for Airflow 3
   import datetime
   from airflow.sdk import dag
   from airflow.providers.standard.operators.empty import EmptyOperator
   
   
   @dag(start_date=datetime.datetime(2021, 1, 1), schedule="@daily")
   def generate_dag():
       EmptyOperator(task_id="task")
   
   
   my_dag = generate_dag()
   
   
   if __name__ == '__main__':
       my_dag.test()
   ```
   
   The command to run this code:
   ```shell
   python test.py
   ```
   
   The Dockerfile imitating the setup on the DAG author's machine:
   ```Dockerfile
   FROM python:3.10-slim
   
   COPY test.py test.py
   
   RUN python -m pip install "apache-airflow[celery]==3.0.1" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-3.0.1/constraints-3.10.txt"
   
   RUN printf '%s\n' \
       '#!/bin/bash' \
       'airflow standalone > airflow.log 2>&1 &' \
       'echo "Wait until init..."' \
       'sleep 15' \
       'python test.py' \
       > /entrypoint.sh && chmod +x /entrypoint.sh
   
   CMD ["/entrypoint.sh"]
   ```
   
   ### Operating System
   
   macOS 15.4.1 locally, Debian in Docker
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   The setup I use to validate that everything works as expected in Airflow 2:
   ```python
   # test.py for Airflow 2
   import datetime
   
   from airflow.decorators import dag
   from airflow.operators.empty import EmptyOperator
   
   
   @dag(start_date=datetime.datetime(2021, 1, 1), schedule="@daily")
   def generate_dag():
       EmptyOperator(task_id="task")
   
   
   my_dag = generate_dag()
   
   
   if __name__ == '__main__':
       my_dag.test()
   ```
   
   ```Dockerfile
   # Dockerfile for Airflow 2
   FROM python:3.10-slim
   
   COPY test.py test.py
   
   RUN python -m pip install "apache-airflow[celery]==2.10.5" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.10.5/constraints-3.10.txt"
   
   RUN printf '%s\n' \
       '#!/bin/bash' \
       'airflow standalone > airflow.log 2>&1 &' \
       'echo "Wait until init..."' \
       'sleep 15' \
       'python test.py' \
       > /entrypoint.sh && chmod +x /entrypoint.sh
   
   CMD ["/entrypoint.sh"]
   ```
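   
   Since the two setups differ only in their import paths, a `try`/`except ImportError` shim is one way to keep a single `test.py` parsing under both majors. The sketch below is illustrative only: it uses stand-in modules (`json` under a fake "new" name) so it runs without Airflow installed; the real paths would be `from airflow.sdk import dag` falling back to `from airflow.decorators import dag`.
   
   ```python
   # Illustrative import shim with stand-in modules (not real Airflow paths):
   # prefer the new location, fall back to the old one on ImportError.
   try:
       from json_future import loads as parse  # stand-in "new" path; does not exist
   except ImportError:
       from json import loads as parse  # stand-in "old" path
   
   print(parse("[1, 2, 3]"))  # → [1, 2, 3]
   ```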
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

