potiuk commented on issue #31213:
URL: https://github.com/apache/airflow/issues/31213#issuecomment-2254350184

   Sure. It's all rather well defined. Let me describe the testing options. 
Might also be useful for others:
   
   1) Testing it locally with Airflow:
   
   ```
   breeze start-airflow --database-isolation --standalone-dag-processor --executor CeleryExecutor --load-example-dags --load-default-connections
   ```
   
   This one will start the Breeze container (the first time, the image might 
need to be built if you've never done it before). Once you enter the container, 
it will run Airflow in multiple terminals (it splits the terminal using 
`tmux`), so you will see all the Airflow components running. You will be able 
to connect to Airflow locally at http://localhost:28080, and DAGs should be 
nicely triggerable and should `generally` work.
   
   Breeze mounts the code from your project into the container, so once you 
modify the code, it's enough to Ctrl-C the component, press the up arrow to 
recall the last command, and run it again.
   
   This is the single most useful tool for testing things end-to-end.
   
   2) Unit tests: our `internal_api` tests in `tests/api_internal` - there 
you can test both the endpoint and the call. These run as regular unit tests; 
you just need an Airflow venv for that. I think `pip install -e 
".[devel,pydantic]"` should be enough to run the tests (maybe some other devel 
extras are needed) - see 
https://airflow.apache.org/docs/apache-airflow/stable/extra-packages-ref.html#development-extras
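   The workflow for option 2 could look roughly like this (a sketch only - the venv layout is my assumption, and as noted above you may need additional devel extras):
   
   ```shell
   # From a clone of apache/airflow, in a fresh virtualenv (layout assumed here):
   python -m venv .venv && source .venv/bin/activate
   
   # Editable install with the devel extras mentioned above
   # (some tests may need further devel extras - see the extras reference).
   pip install -e ".[devel,pydantic]"
   
   # Run the internal-api unit tests as regular pytest tests:
   pytest tests/api_internal
   ```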
   
   3) As of recently, we also have an easy way to run unit tests in isolation 
mode. Not all tests are supposed to succeed there, and we are working on 
fixing/excluding the tests that should be excluded in 
https://github.com/apache/airflow/pull/41067 - but you can already run some 
tests using database isolation. It works like this: you enter `breeze shell 
--database-isolation`, split the terminals with tmux, run the internal-api 
component in one of them, and then run pytest tests in the other terminal. 
Again, the sources are mounted inside the container, so you can easily iterate 
on tests (Ctrl-C and restart internal-api to pick up changes).
   
   One test that fully works for now is `dag_processing/test_job_runner.py`, 
for example - a good start.
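   The two-terminal workflow for option 3 could be sketched like this (the exact command for starting the internal-api component is my assumption - check `airflow --help` inside the container):
   
   ```shell
   # Terminal 1: enter the Breeze container in database-isolation mode
   breeze shell --database-isolation
   
   # Inside the container, split the window with tmux, then in one pane
   # start the internal-api component (command name assumed here):
   airflow internal-api
   
   # In the other pane, run a test known to work under isolation:
   pytest tests/dag_processing/test_job_runner.py
   ```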
   
   4) What needs to be moved: 
   
   
https://github.com/apache/airflow/blob/a482d0f8513a2db633ae3409b0264d6c30dc6357/airflow/serialization/serialized_objects.py#L618
  -> currently everything is controlled by this one, plus a lot of Pydantic 
classes there and some other classes that were added to `serialized_objects` 
to make internal_api work. For now, running the end-to-end `start-airflow` is 
the easiest way to test it - once we complete the "db-isolated tests" in 
https://github.com/apache/airflow/pull/41067, we will be able to verify it in 
regular CI PRs.
   
