This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git
commit eff56434936f7198a4906434ef0fed76ca8c68d6
Author: WackyGem <[email protected]>
AuthorDate: Fri Mar 8 08:44:17 2024 +0800

    doc: add a section about debugging in docker-compose with PyCharm (#37940)

    * doc: add a section about debugging in docker-compose with PyCharm

    * doc: revise documentation details and fix static errors

    (cherry picked from commit 804ba590c58224bde947df11dedb0c8c3a584f59)
---
 docs/apache-airflow/howto/docker-compose/index.rst | 43 +++++++++++++++++++++
 .../img/add_container_python_interpreter.png       | Bin 0 -> 339750 bytes
 2 files changed, 43 insertions(+)

diff --git a/docs/apache-airflow/howto/docker-compose/index.rst b/docs/apache-airflow/howto/docker-compose/index.rst
index 087b480993..5bff8e16c7 100644
--- a/docs/apache-airflow/howto/docker-compose/index.rst
+++ b/docs/apache-airflow/howto/docker-compose/index.rst
@@ -345,6 +345,49 @@ Networking
 In general, if you want to use Airflow locally, your DAGs may try to connect to servers which are running on the host. In order to achieve that, an extra configuration must be added in ``docker-compose.yaml``. For example, on Linux the configuration must be in the section ``services: airflow-worker`` adding ``extra_hosts: - "host.docker.internal:host-gateway"``; and use ``host.docker.internal`` instead of ``localhost``. This configuration varies across platforms. Please check the Doc [...]

+Debug Airflow inside a Docker container using PyCharm
+========================================================
+.. jinja:: quick_start_ctx
+
+    Prerequisites: Create a project in **PyCharm** and download the `docker-compose.yaml <{{ doc_root_url }}docker-compose.yaml>`__ file.
+
+Steps:
+
+1) Modify ``docker-compose.yaml``
+
+   Add the following section under the ``services`` section:
+
+.. code-block:: yaml
+
+    airflow-python:
+      <<: *airflow-common
+      profiles:
+        - debug
+      environment:
+        <<: *airflow-common-env
+      user: "50000:0"
+      entrypoint: ["bash"]
+
+.. note::
+
+    This code snippet creates a new service named **"airflow-python"** specifically for PyCharm's Python interpreter.
+    On a Linux system, if you have executed the command ``echo -e "AIRFLOW_UID=$(id -u)" > .env``, you need to set
+    ``user: "50000:0"`` in the ``airflow-python`` service to avoid PyCharm's ``Unresolved reference 'airflow'`` error.
+
+2) Configure the PyCharm interpreter
+
+   * Open PyCharm and navigate to **Settings** (or **Preferences** on macOS) > **Project: <Your Project Name>** > **Python Interpreter**.
+   * Click the **"Add Interpreter"** button and choose **"On Docker Compose"**.
+   * In the **Configuration file** field, select your ``docker-compose.yaml`` file.
+   * In the **Service** field, choose the newly added ``airflow-python`` service.
+   * Click **"Next"** and follow the prompts to complete the configuration.
+
+.. image:: /img/add_container_python_interpreter.png
+   :alt: Configuring the container's Python interpreter in PyCharm, step diagram
+
+Building the interpreter index might take some time.
+Once configured, you can debug your Airflow code within the container environment, mimicking your local setup.
+
+
 FAQ: Frequently asked questions
 ===============================
diff --git a/docs/apache-airflow/img/add_container_python_interpreter.png b/docs/apache-airflow/img/add_container_python_interpreter.png
new file mode 100644
index 0000000000..c0bd2d7701
Binary files /dev/null and b/docs/apache-airflow/img/add_container_python_interpreter.png differ
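Because the service carries ``profiles: [debug]``, a plain ``docker compose up`` will not start it. As a usage sketch (standard Docker Compose CLI; not part of the patch), it can be selected explicitly:

```shell
# Services under a profile only start when that profile is selected.
docker compose --profile debug up -d airflow-python

# Or run the service once, interactively, removing the container on exit.
docker compose run --rm airflow-python
```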
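Editor's note on the networking context in the hunk above: it advises swapping ``localhost`` for ``host.docker.internal`` once the ``extra_hosts`` mapping is in place. As a rough illustration (not part of the patch; ``resolve_host_gateway`` is a hypothetical helper), DAG code that must run both inside the containers and directly on the host could pick the host name dynamically:

```python
import socket


def resolve_host_gateway(default: str = "localhost") -> str:
    """Return 'host.docker.internal' when it resolves (i.e. the
    extra_hosts mapping is in effect inside a container), otherwise
    fall back to the given default host name."""
    try:
        socket.gethostbyname("host.docker.internal")
        return "host.docker.internal"
    except socket.gaierror:
        return default
```

Outside a container with the mapping, the lookup fails and the helper simply returns the default.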
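To make the YAML merge keys in the new ``airflow-python`` service concrete, here is a minimal sketch in plain Python dicts of what ``<<: *airflow-common`` and ``<<: *airflow-common-env`` expand to; the ``airflow_common`` and ``airflow_common_env`` contents below are hypothetical stand-ins for the real anchors in ``docker-compose.yaml``:

```python
# Hypothetical stand-ins for the &airflow-common / &airflow-common-env anchors.
airflow_common = {"image": "apache/airflow:2.8.2", "environment": {}}
airflow_common_env = {"AIRFLOW__CORE__EXECUTOR": "CeleryExecutor"}

# What the airflow-python service looks like after YAML merge-key expansion:
airflow_python = {
    **airflow_common,                       # <<: *airflow-common
    "profiles": ["debug"],                  # skipped by a plain `docker compose up`
    "environment": {**airflow_common_env},  # <<: *airflow-common-env
    "user": "50000:0",                      # uid 50000, gid 0
    "entrypoint": ["bash"],                 # a shell for PyCharm to attach to
}
```

Note that keys set directly on the service (``environment``, ``user``, ``entrypoint``) override anything the merge key brought in, which is why the ``user: "50000:0"`` fix from the note takes effect.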
