Lukas1v opened a new issue, #38104: URL: https://github.com/apache/airflow/issues/38104
### Apache Airflow version

main (development)

### If "Other Airflow 2 version" selected, which one?

_No response_

### What happened?

Following the documentation on setting up GitHub Codespaces, https://github.com/apache/airflow/blob/7ec2407e7f50251fff4359ffe7109dd8dc360f24/contributing-docs/quick-start-ide/contributors_quick_start_codespaces.rst :

- Forked the project and created a new codespace.
- Ran a breeze command:

```
root@a320af5e0f55:/opt/airflow# breeze
bash: breeze: command not found
```

- Installed breeze manually from `/opt/airflow` using `pipx install -e ./dev/breeze`
- Ran breeze again:

```
root@a320af5e0f55:/opt/airflow# breeze --python 3.8 --backend postgres
The value 5.7 is not allowed for parameter mysql_version. Setting default value to 8.0
The value 10 is not allowed for parameter postgres_version. Setting default value to 12
CI image for Python 3.8 was never built locally or was deleted. Forcing build.
Found 89 providers with 1228 Python files.
Building CI Image for Python 3.8
ERROR: BuildKit is enabled but the buildx component is missing or broken.
       Install the buildx component to build images with BuildKit:
       https://docs.docker.com/go/buildx/
Your image build failed. It could be caused by conflicting dependencies.
Run 'breeze ci-image build --upgrade-to-newer-dependencies' to upgrade them.
Good version of Docker: 25.0.4.
You either do not have docker-composer or have docker-compose v1 installed.
```

- Installed Docker using https://github.com/docker/docker-install/
- Breeze starts now, but still shows errors in the output:

```
root@a320af5e0f55:/opt/airflow# breeze --python 3.8 --backend sqlite
The value 5.7 is not allowed for parameter mysql_version. Setting default value to 8.0
The value 10 is not allowed for parameter postgres_version. Setting default value to 12
CI image for Python 3.8 was never built locally or was deleted. Forcing build.
Found 89 providers with 1228 Python files.
Building CI Image for Python 3.8
Using default as context.
default
Current context is now "default"
[+] Building 879.9s (69/69) FINISHED                                                     docker:default
 => [internal] load build definition from Dockerfile.ci  0.4s
 => => transferring dockerfile: 55.54kB  0.0s
 => [internal] load .dockerignore  0.4s
 => => transferring context: 3.08kB  0.0s
 => resolve image config for docker.io/docker/dockerfile:1.4  1.4s
 => [auth] docker/dockerfile:pull token for registry-1.docker.io  0.0s
 => docker-image://docker.io/docker/dockerfile:1.4@sha256:9ba7531bd80fb0a858632727cf7a112fbfd19b17e94c4e84ced81e24ef1a0dbc  0.9s
 => => resolve docker.io/docker/dockerfile:1.4@sha256:9ba7531bd80fb0a858632727cf7a112fbfd19b17e94c4e84ced81e24ef1a0dbc  0.1s
 => => sha256:1328b32c40fca9bcf9d70d8eccb72eb873d1124d72dadce04db8badbe7b08546 9.94MB / 9.94MB  0.3s
 => => sha256:9ba7531bd80fb0a858632727cf7a112fbfd19b17e94c4e84ced81e24ef1a0dbc 2.00kB / 2.00kB  0.0s
 => => sha256:ad87fb03593d1b71f9a1cfc1406c4aafcb253b1dabebf569768d6e6166836f34 528B / 528B  0.0s
 => => sha256:1e8a16826fd1c80a63fa6817a9c7284c94e40cded14a9b0d0d3722356efa47bd 2.37kB / 2.37kB  0.0s
 => => extracting sha256:1328b32c40fca9bcf9d70d8eccb72eb873d1124d72dadce04db8badbe7b08546  0.2s
 => [internal] load metadata for docker.io/library/python:3.8-slim-bookworm  1.4s
 => [auth] library/python:pull token for registry-1.docker.io  0.0s
 => ERROR importing cache manifest from ghcr.io/lukas1v/airflow/main/ci/python3.8:cache-linux-amd64  0.6s
 => [internal] preparing inline document  2.0s
 => [internal] preparing inline document  0.5s
 => [internal] preparing inline document  1.0s
 => [internal] preparing inline document  1.2s
 => [internal] load build context  4.9s
 => => transferring context: 72.92MB  3.4s
 => [main 1/34] FROM docker.io/library/python:3.8-slim-bookworm@sha256:e796941013b10bb53a0924d8705485a1afe654bbbc6fe71d32509101e44b6414  2.8s
 => => resolve docker.io/library/python:3.8-slim-bookworm@sha256:e796941013b10bb53a0924d8705485a1afe654bbbc6fe71d32509101e44b6414  0.2s
 => => sha256:e796941013b10bb53a0924d8705485a1afe654bbbc6fe71d32509101e44b6414 1.86kB / 1.86kB  0.0s
 => => sha256:b5c1888e49ebea8010f93fb23daaf5e2c672752c16603b91dfdf609e1ee1805c 1.37kB / 1.37kB  0.0s
 => => sha256:978a13f89eb4fcb0ef557ae3618de49e2a3149942f4d44e5c80c7ca243fd34a2 6.97kB / 6.97kB  0.0s
 => [internal] preparing inline document  0.2s
 => [internal] preparing inline document  1.4s
 => [internal] preparing inline document  1.5s
 => [internal] preparing inline document  1.6s
 => [internal] preparing inline document  1.8s
 => [internal] preparing inline document  1.8s
 => [internal] preparing inline document  1.9s
 => [internal] preparing inline document  1.9s
 => [auth] lukas1v/airflow/main/ci/python3.8:pull token for ghcr.io  0.0s
 => [scripts 2/13] COPY <<EOF /install_os_dependencies.sh  0.8s
 => [main 2/34] RUN echo "Base image version: python:3.8-slim-bookworm"  1.7s
 => [scripts 3/13] COPY <<EOF /install_mysql.sh  0.5s
 => [scripts 4/13] COPY <<EOF /install_mssql.sh  0.6s
 => [scripts 5/13] COPY <<EOF /install_postgres.sh  0.4s
 => [scripts 6/13] COPY <<EOF /install_packaging_tools.sh  0.3s
 => [scripts 7/13] COPY <<EOF /install_airflow_dependencies_from_branch_tip.sh  0.3s
 => [scripts 8/13] COPY <<EOF /common.sh  0.3s
 => [scripts 9/13] COPY <<EOF /install_pipx_tools.sh  0.3s
 => [scripts 10/13] COPY <<EOF /install_airflow.sh  0.3s
 => [scripts 11/13] COPY <<EOF /install_additional_dependencies.sh  0.3s
 => [scripts 12/13] COPY <<EOF /entrypoint_ci.sh  0.3s
 => [scripts 13/13] COPY <<EOF /entrypoint_exec.sh  0.3s
 => [main 3/34] COPY --from=scripts install_os_dependencies.sh /scripts/docker/  0.4s
 => [main 4/34] RUN bash /scripts/docker/install_os_dependencies.sh dev  179.9s
 => [main 5/34] COPY --from=scripts common.sh /scripts/docker/  0.3s
 => [main 6/34] COPY --from=scripts install_mysql.sh install_mssql.sh install_postgres.sh /scripts/docker/  0.4s
 => [main 7/34] RUN bash /scripts/docker/install_mysql.sh prod && bash /scripts/docker/install_mysql.sh dev && bash /scripts/docker/install_mssql.sh dev && bash /scripts/docker/install_postgr  30.6s
 => [main 8/34] RUN SYSTEM=$(uname -s | tr '[:upper:]' '[:lower:]') && PLATFORM=$([ "$(uname -m)" = "aarch64" ] && echo "arm64" || echo "amd64" ) && HELM_URL="https://get.helm.sh/helm-v3.9.4-${SYS  1.3s
 => [main 9/34] WORKDIR /opt/airflow  0.3s
 => [main 10/34] RUN mkdir -pv /root/airflow && mkdir -pv /root/airflow/dags && mkdir -pv /root/airflow/logs  0.5s
 => [main 11/34] RUN echo "Airflow version: 2.9.0.dev0"  0.5s
 => [main 12/34] COPY --from=scripts install_packaging_tools.sh install_airflow_dependencies_from_branch_tip.sh common.sh /scripts/docker/  0.4s
 => [main 13/34] RUN bash /scripts/docker/install_packaging_tools.sh; if [[ true == "true" ]]; then bash /scripts/docker/install_airflow_dependencies_from_branch_tip.sh; fi  281.7s
 => [main 14/34] COPY --from=scripts install_pipx_tools.sh /scripts/docker/  0.5s
 => [main 15/34] RUN bash /scripts/docker/install_pipx_tools.sh  15.0s
 => [main 16/34] COPY pyproject.toml /opt/airflow/pyproject.toml  0.4s
 => [main 17/34] COPY airflow/__init__.py /opt/airflow/airflow/  0.3s
 => [main 18/34] COPY generated/* /opt/airflow/generated/  0.4s
 => [main 19/34] COPY constraints/* /opt/airflow/constraints/  0.3s
 => [main 20/34] COPY LICENSE /opt/airflow/LICENSE  0.3s
 => [main 21/34] COPY airflow_pre_installed_providers.txt /opt/airflow/  0.3s
 => [main 22/34] COPY hatch_build.py /opt/airflow/  0.3s
 => [main 23/34] COPY --from=scripts install_airflow.sh /scripts/docker/  0.3s
 => [main 24/34] RUN bash /scripts/docker/install_airflow.sh  66.6s
 => [main 25/34] COPY --from=scripts entrypoint_ci.sh /entrypoint  0.4s
 => [main 26/34] COPY --from=scripts entrypoint_exec.sh /entrypoint-exec  0.3s
 => [main 27/34] RUN chmod a+x /entrypoint /entrypoint-exec  0.6s
 => [main 28/34] COPY --from=scripts install_packaging_tools.sh install_additional_dependencies.sh /scripts/docker/  0.4s
 => [main 29/34] RUN bash /scripts/docker/install_packaging_tools.sh; if [[ -n "" ]]; then bash /scripts/docker/install_additional_dependencies.sh; fi  1.1s
 => [main 30/34] RUN if command -v airflow; then register-python-argcomplete airflow >> ~/.bashrc ; fi  0.7s
 => [main 31/34] RUN echo "source /etc/bash_completion" >> ~/.bashrc  0.6s
 => [main 32/34] COPY . /opt/airflow/  4.1s
 => [main 33/34] WORKDIR /opt/airflow  0.3s
 => [main 34/34] RUN ln -sf /usr/bin/dumb-init /usr/local/bin/dumb-init  0.5s
 => exporting to image  245.8s
 => => exporting layers  245.7s
 => => writing image sha256:f4d7f72623ba91b47aa1ac8029eae575474b1d9f260cee539a56363f13acb232  0.0s
 => => naming to ghcr.io/lukas1v/airflow/main/ci/python3.8:latest  0.0s
------
 > importing cache manifest from ghcr.io/lukas1v/airflow/main/ci/python3.8:cache-linux-amd64:
------
Attempting to generate provider dependencies.
89 provider.yaml file(s) changed since last check. Found 89 providers with 1228 Python files.
Good version of Docker: 25.0.4.
Good version of docker-compose: 2.24.7
Executable permissions on entrypoints are OK
Unable to find image 'python:3.8-slim-bookworm' locally
3.8-slim-bookworm: Pulling from library/python
8a1e25ce7c4f: Already exists
1103112ebfc4: Already exists
b7d41b19b655: Already exists
6a1ad0671ce8: Already exists
de92c59aadaa: Already exists
Digest: sha256:e796941013b10bb53a0924d8705485a1afe654bbbc6fe71d32509101e44b6414
Status: Downloaded newer image for python:3.8-slim-bookworm
docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "/opt/airflow/scripts/in_container/run_fix_ownership.py": stat /opt/airflow/scripts/in_container/run_fix_ownership.py: no such file or directory: unknown.
[Airflow Breeze ASCII-art banner]

Airflow Breeze Cheatsheet

* Port forwarding:

  Ports are forwarded to the running docker containers for webserver and database
    * 12322 -> forwarded to Airflow ssh server -> airflow:22
    * 28080 -> forwarded to Airflow webserver -> airflow:8080
    * 25555 -> forwarded to Flower dashboard -> airflow:5555
    * 25433 -> forwarded to Postgres database -> postgres:5432
    * 23306 -> forwarded to MySQL database -> mysql:3306
    * 26379 -> forwarded to Redis broker -> redis:6379

  Direct links to those services that you can use from the host:
    * ssh connection for remote debugging: ssh -p 12322 [email protected] (password: airflow)
    * Webserver: http://127.0.0.1:28080/
    * Flower: http://127.0.0.1:25555/
    * Postgres: jdbc:postgresql://127.0.0.1:25433/airflow?user=postgres&password=airflow
    * Mysql: jdbc:mysql://127.0.0.1:23306/airflow?user=root
    * Redis: redis://127.0.0.1:26379/0

* How can I add my stuff in Breeze:

  * Your dags for webserver and scheduler are read from `/files/dags` directory
    which is mounted from folder in Airflow sources:
    * `/opt/airflow/files/dags`
  * Your plugins are read from `/files/plugins` directory which is mounted from
    folder in Airflow sources:
    * `/opt/airflow/files/plugins`
  * You can add `airflow-breeze-config` directory. Place it in
    `/opt/airflow/files/airflow-breeze-config` and:
    * Add `variables.env` - to make breeze source the variables automatically for you
    * Add `.tmux.conf` - to add extra initial configuration to `tmux`
    * Add `init.sh` - this file will be sourced when you enter container, so you can
      add any custom code there.
    * Add `requirements.
  * You can also share other files, put them under `/opt/airflow/files` folder
    and they will be visible in `/files/` folder inside the container.

* Other options

  Check out `--help` for `breeze` command. It will show you other options, such as
  running integration or starting complete Airflow using `start-airflow` command as
  well as ways of cleaning up the installation.

Make sure to run `setup-autocomplete` to get the commands and options auto-completable in your shell.

You can disable this cheatsheet by running:

    breeze setup config --no-cheatsheet

backend: sqlite is not compatible with executor: LocalExecutor. Changing the executor to SequentialExecutor.
[+] Creating 2/2
 ✔ Network breeze_default              Created  0.1s
 ✔ Volume "breeze_sqlite-db-volume"    Created  0.1s
Error response from daemon: invalid mount config for type "bind": bind source path does not exist: /opt/airflow/.dockerignore
Error 1 returned
docker: Error response from daemon: failed to create task for container: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "/opt/airflow/scripts/in_container/run_fix_ownership.py": stat /opt/airflow/scripts/in_container/run_fix_ownership.py: no such file or directory: unknown.
```

### What you think should happen instead?
As stated in the docs, Breeze should come pre-installed in the devcontainer.

### How to reproduce

- Create a new codespace.
- Run a breeze command.
- Install breeze.

### Operating System

Debian GNU/Linux 12 (from devcontainer)

### Versions of Apache Airflow Providers

_No response_

### Deployment

Other Docker-based deployment

### Deployment details

_No response_

### Anything else?

_No response_

### Are you willing to submit PR?

- [X] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
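The two failures above both come down to missing prerequisites on PATH: the `breeze` command itself and a working `docker` (with the buildx plugin). A minimal diagnostic sketch for checking this up front in a fresh codespace; `check_tool` is a hypothetical helper, not part of Breeze or the original report:

```shell
# check_tool is a hypothetical helper (not from the report): it prints
# "<name>: found" if the command is on PATH, "<name>: missing" otherwise.
check_tool() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1: found"
    else
        echo "$1: missing"
    fi
}

# The two prerequisites whose absence produced the errors in this report:
check_tool breeze   # expected to be pre-installed in the devcontainer
check_tool docker   # must also provide the buildx plugin for BuildKit builds
```

If `breeze` is missing, the workaround used in the report was `pipx install -e ./dev/breeze` from the repository root; if buildx is missing, see https://docs.docker.com/go/buildx/ (the reporter used the https://github.com/docker/docker-install/ convenience script).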
