The GitHub Actions job "Tests" on airflow.git/improve-task-sdk-integration-iteration-speed has succeeded. Run started by GitHub user potiuk (triggered by potiuk).
Head commit for run: 36ce448d358c8e50f158173c5f51f7feb6a81c03 / Jarek Potiuk <[email protected]>

Make it easier and faster to iterate on task-sdk-integration-tests

There were a few things that made task-sdk-tests iteration a bit unobvious and slower. With this PR, we should be able to iterate over task-sdk-integration-tests WAY faster and get more contributors involved in contributing to those.

* It was not clear that a prerequisite of running the tests was building the PROD image for Python 3.10. This is now clear in the documentation.
* PROD images can be built in two different modes - from sources with --installation-method equal to "." or from packages with --installation-method equal to "apache-airflow". This was not clearly communicated during the build and is now printed in the output.
* It was not clear that when you build PROD images from sources, you should first compile the UI assets, because otherwise the assets are not added as part of the image. With this PR the `breeze prod-image build` command checks if the .vite manifest is present in the right `dist` folders and errors out, suggesting to run `breeze compile-ui-assets` first.
* If the PROD image has not been built before, breeze will propose to build it and even do it automatically if no answer is provided within 20 seconds.
* When building PROD images from sources, it is faster to rebuild the images with `uv` than with `pip`. The --use-uv parameter now defaults to False when building from packages and to True when building from sources.
* There was an error in .dockerignore where generated dist files were not added to the context when the PROD image was built from sources. This resulted in "permission denied" when such PROD images were used to run tests.
* The test compose had a fallback of Airflow 3.0.3, which would be misleading if it happened. Now AIRFLOW_IMAGE_NAME is mandatory.
* We are now mounting the sources of Airflow inside the image by default and skip it in CI.
This mounting happens in the local environment, where the PROD image is usually built from sources, and it is disabled in CI by using the --skip-mounting-local-volumes flag. We also do not stop docker compose by default when running it locally, in order to make fast iteration the default.

* We pass the host operating system when starting the compose, and we only change ownership on Linux - this is a long-running operation on macOS because the mounted filesystem is slow, but it is also not needed on macOS because the filesystem also maps ownership, so files created by Airflow are created with the local user id.
* We pass the local user id to the containers to make sure that files created on Linux (logs and the like) are created by the local user.
* We now detect whether docker compose is running, and when we run with locally mounted sources, we reuse those running containers. When we don't mount local sources, we shut down the compose before running to make sure we do not have sources mounted - and we close the compose by default when we do not mount local sources.
* When sources are mounted, we enable DEV_MODE inside the containers so that components are hot-reloading (a new feature added recently in #57741). This way you do not have to restart anything when sources change, and you can re-run the tests while docker compose is running.
* The environment variables are now passed via an .env file so that you can easily reproduce the docker compose command locally.
* The docker compose files are not copied any more; they are moved directly to the top of 'task-sdk-integration-tests' and used from there.
* A `--down` flag was added to `breeze testing task-sdk-integration-tests` that tears down the running docker compose.
* Additional diagnostics were added to show what's going on.
* The verbose option from breeze is now handled by adding more debugging information.
* Updated the documentation about the tests to be more comprehensive about the options, the updated file structure, etc.
* Small QOL improvement - if the expected dags have not yet been parsed by the dag file processor when a test starts, getting their status returns 404 "Not Found". For such cases our tests implement a short retry scheme with tenacity.

Report URL: https://github.com/apache/airflow/actions/runs/19334662191

With regards,
GitHub Actions via GitBox

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
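P.S. The tenacity-style retry for the 404 "Not Found" case can be sketched roughly as follows. This is a minimal illustration using only the standard library instead of tenacity itself; the function and exception names (`get_dag_status_with_retry`, `NotFoundError`) are hypothetical and not the actual names used in the tests:

```python
import time


class NotFoundError(Exception):
    """Raised when the API returns 404 because the dag is not yet parsed."""


def get_dag_status_with_retry(fetch, attempts=5, wait=2.0):
    """Call `fetch` and retry while it raises NotFoundError.

    Mirrors a simple tenacity retry: stop after `attempts` tries,
    sleeping `wait` seconds between them, re-raising on exhaustion.
    """
    for attempt in range(1, attempts + 1):
        try:
            return fetch()
        except NotFoundError:
            if attempt == attempts:
                raise
            time.sleep(wait)
```

In the real tests the equivalent behavior is provided by tenacity's retry decorator, so transient 404s while the dag file processor catches up do not fail the test run.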
