amoghrajesh commented on code in PR #58231:
URL: https://github.com/apache/airflow/pull/58231#discussion_r2523277856


##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -68,13 +68,61 @@ Why Task SDK Integration?
 Running Task SDK Integration Tests
 ----------------------------------
 
-There are multiple ways to run Task SDK Integration Tests depending based on your preferences.
+Prerequisite - build PROD image
+...............................
+
+.. note::
+
+   The task-sdk integration tests are using locally build production images started in docker-compose by

Review Comment:
   ```suggestion
   The task-sdk integration tests are using locally built production images started in docker-compose by
   ```



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -68,13 +68,61 @@ Why Task SDK Integration?
 Running Task SDK Integration Tests
 ----------------------------------
 
-There are multiple ways to run Task SDK Integration Tests depending based on your preferences.
+Prerequisite - build PROD image
+...............................
+
+.. note::
+
+   The task-sdk integration tests are using locally build production images started in docker-compose by
+   Pytest. This means that while the tests are running in the environment that you start it from (usually
+   local development environment), you need to first build the images that you want to test against.
+
+You also need to make sure that your assets are built first.
+
+.. code-block:: bash
+
+   # From the Airflow repository root
+   breeze compile-ui-assets
+
+Then, you should build the base image once before running the tests. You can do it using Breeze:
+
+.. code-block:: bash
+
+   # From the Airflow repository root
+   breeze prod-image build --python 3.10
+
+The first build may take a while as it needs to download base image, build Python, install dependencies
+and set up the environment. Subsequent builds will be much faster as they will use cached layers.
+
+You can choose other Python versions supported by Airflow by changing the ``--python`` argument.
+
+If you use ``breeze`` to run the integration tests and you do not have the image built before,
+``breeze`` will prompt you to build it, and the building will proceed automatically after 20 seconds
+if you do not answer ``no``.
+
+This will build the right image ``ghcr.io/apache/airflow/main/prod/python3.10.latest`` (with the right
+Python version) that will be used to run the tests. The ``breeze prod image build`` command by default -
+when run from sources of airflow - will use the local sources and build the image using ``uv``
+to speed up the build process. Also when building from sources it will check if the assets are built
+and will error if they are not. However it will not check if the assets are up to date - so make sure
+to run the ``breeze compile-ui-assets`` command above if you have changed any UI sources
+and did not build your assets after that.
+
+.. tip::
+
+    Note that you do not need to rebuild the image every time you run the tests and change Python sources -
+    because the docker-compose setup we use in tests will automatically mount the local Python sources into the
+    container, so you can iterate quickly without rebuilding the image. However, if you want to test changes
+    that require new image (like modifying dependencies, system packages, rebuilding UI etc.) you will need
+    to rebuild the image with the ``breeze prod image build`` command.
+
+After you build the image, there are several ways to run Task SDK Integration Tests,
+depending based on your preferences.

Review Comment:
   ```suggestion
   depending on your preferences. The ways are listed below.
   ```



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -87,36 +135,103 @@ reproducibility:
    # Run with custom Docker image
   DOCKER_IMAGE=my-custom-airflow-image:latest breeze testing task-sdk-integration-tests
 
-Running in Your Current Virtual Environment
-...........................................
+Using uv
+........
 
-Since you're already working in the Airflow repository, you can run Task SDK Integration Tests
-directly:
+Since you're already working in the Airflow repository, you can run Task SDK Integration Tests directly.
+Make sure you have ``uv`` installed in your environment and make sure to run all those commands
+in the ``task-sdk-integration-tests`` directory. You can also run ``uv sync`` in the directory
+first to make sure that your virtual environment is up to date.
+
+.. code-block::
+
+    # Navigate to task-sdk-integration-tests directory
+    cd task-sdk-integration-tests
+
+    # Sync dependencies
+    uv sync
+
+All the ``uv`` and ``docker compose`` commands below should be run from within the
+``task-sdk-integration-tests`` directory.
 
 **Run Tests**
 
 .. code-block:: bash
 
    # Navigate to task-sdk-integration-tests directory and run tests
-   cd task-sdk-integration-tests/
    uv run pytest -s
 
    # Run specific test file
-   cd task-sdk-integration-tests/
    uv run pytest tests/task_sdk_tests/test_task_sdk_health.py -s
 
-   # Keep containers running for debugging
-   cd task-sdk-integration-tests/
-   SKIP_DOCKER_COMPOSE_DELETION=1 uv run pytest -s
-
 **Optional: Set Custom Docker Image**
 
 .. code-block:: bash
 
    # Use a different Airflow image for testing
-   cd task-sdk-integration-tests/
    DOCKER_IMAGE=my-custom-airflow:latest uv run pytest -s
 
+By default when you run your tests locally, the Docker Compose deployment is kept between the sessions,
+your local sources are mounted into the containers and the Airflow services are restarted automatically
+(hot reloaded) when Python sources change.
+
+This allows for quick iterations without rebuilding the image or restarting the containers.
+
+Generated .env file
+...................
+
+When you run the tests an .env file is generated in the task-sdk-integration-tests directory.
+This file contains environment variables used by docker-compose to configure the services.
+You can inspect or modify this file if you need to change any configurations so that you can
+also debug issues by running ``docker compose`` commands directly.
+
+When running the tests with VERBOSE=1 environment variable set or --verbose flag passed to breeze command,
+the docker-compose commands used to start the services are also printed to the console and you can copy
+them to run them directly.
+
+Stopping docker-compose
+.......................
+
+When you finish testing (or when you updated dependencies and rebuild your images),
+you likely want to stop the running containers. You can stop the running containers by running:
+
+.. code-block:: bash
+
+   # Stop and remove containers
+   docker-compose down -v --remove-orphans
+
+amd with breeze:
+
+
+.. code-block:: bash
+
+   # Using Breeze to stop docker compose
+   breeze testing task-sdk-integration-tests --down
+
+Docker compose will be automatically started again next time you run the tests.
+
+Running tests in the way CI does it
+....................................
+
+Our CI runs the tests in a clean environment every time without mounting local sources. This means that
+any changes you have locally will not be visible inside the containers. You can reproduce it locally by adding
+--skip-mounting-local-volumes to breeze command or by setting SKIP_MOUNTING_LOCAL_VOLUMES=1 in your

Review Comment:
   ```suggestion
   ``--skip-mounting-local-volumes`` to breeze command or by setting ``SKIP_MOUNTING_LOCAL_VOLUMES=1`` in your
   ```



##########
dev/breeze/src/airflow_breeze/utils/run_tests.py:
##########
@@ -43,9 +43,9 @@
 DOCKER_TESTS_TESTS_MODULE_PATH = DOCKER_TESTS_ROOT_PATH / "tests" / "docker_tests"
 DOCKER_TESTS_REQUIREMENTS = DOCKER_TESTS_ROOT_PATH / "requirements.txt"
 
-TASK_SDK_TESTS_ROOT_PATH = AIRFLOW_ROOT_PATH / "task-sdk-integration-tests"
-TASK_SDK_TESTS_TESTS_MODULE_PATH = TASK_SDK_TESTS_ROOT_PATH / "tests" / "task_sdk_tests"
-TASK_SDK_TESTS_REQUIREMENTS = TASK_SDK_TESTS_ROOT_PATH / "requirements.txt"
+TASK_SDK_INTEGRATION_TESTS_ROOT_PATH = AIRFLOW_ROOT_PATH / "task-sdk-integration-tests"
+TASK_SDK_TESTS_TESTS_MODULE_PATH = TASK_SDK_INTEGRATION_TESTS_ROOT_PATH / "tests" / "task_sdk_tests"
+TASK_SDK_TESTS_REQUIREMENTS = TASK_SDK_INTEGRATION_TESTS_ROOT_PATH / "requirements.txt"

Review Comment:
   There is no requirements.txt defined?



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -87,36 +135,103 @@ reproducibility:
    # Run with custom Docker image
   DOCKER_IMAGE=my-custom-airflow-image:latest breeze testing task-sdk-integration-tests
 
-Running in Your Current Virtual Environment
-...........................................
+Using uv
+........
 
-Since you're already working in the Airflow repository, you can run Task SDK Integration Tests
-directly:
+Since you're already working in the Airflow repository, you can run Task SDK Integration Tests directly.
+Make sure you have ``uv`` installed in your environment and make sure to run all those commands
+in the ``task-sdk-integration-tests`` directory. You can also run ``uv sync`` in the directory
+first to make sure that your virtual environment is up to date.
+
+.. code-block::
+
+    # Navigate to task-sdk-integration-tests directory
+    cd task-sdk-integration-tests
+
+    # Sync dependencies
+    uv sync
+
+All the ``uv`` and ``docker compose`` commands below should be run from within the
+``task-sdk-integration-tests`` directory.
 
 **Run Tests**
 
 .. code-block:: bash
 
    # Navigate to task-sdk-integration-tests directory and run tests
-   cd task-sdk-integration-tests/
    uv run pytest -s
 
    # Run specific test file
-   cd task-sdk-integration-tests/
    uv run pytest tests/task_sdk_tests/test_task_sdk_health.py -s
 
-   # Keep containers running for debugging
-   cd task-sdk-integration-tests/
-   SKIP_DOCKER_COMPOSE_DELETION=1 uv run pytest -s
-
 **Optional: Set Custom Docker Image**
 
 .. code-block:: bash
 
    # Use a different Airflow image for testing
-   cd task-sdk-integration-tests/
    DOCKER_IMAGE=my-custom-airflow:latest uv run pytest -s
 
+By default when you run your tests locally, the Docker Compose deployment is kept between the sessions,
+your local sources are mounted into the containers and the Airflow services are restarted automatically
+(hot reloaded) when Python sources change.
+
+This allows for quick iterations without rebuilding the image or restarting the containers.
+
+Generated .env file
+...................
+
+When you run the tests an .env file is generated in the task-sdk-integration-tests directory.
+This file contains environment variables used by docker-compose to configure the services.
+You can inspect or modify this file if you need to change any configurations so that you can
+also debug issues by running ``docker compose`` commands directly.
+
+When running the tests with VERBOSE=1 environment variable set or --verbose flag passed to breeze command,

Review Comment:
   ```suggestion
   When running the tests with ``VERBOSE=1`` environment variable set or ``--verbose`` flag passed to breeze command,
   ```



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -68,13 +68,61 @@ Why Task SDK Integration?
 Running Task SDK Integration Tests
 ----------------------------------
 
-There are multiple ways to run Task SDK Integration Tests depending based on your preferences.
+Prerequisite - build PROD image
+...............................
+
+.. note::
+
+   The task-sdk integration tests are using locally build production images started in docker-compose by
+   Pytest. This means that while the tests are running in the environment that you start it from (usually
+   local development environment), you need to first build the images that you want to test against.
+
+You also need to make sure that your assets are built first.
+
+.. code-block:: bash
+
+   # From the Airflow repository root
+   breeze compile-ui-assets
+
+Then, you should build the base image once before running the tests. You can do it using Breeze:
+
+.. code-block:: bash
+
+   # From the Airflow repository root
+   breeze prod-image build --python 3.10
+
+The first build may take a while as it needs to download base image, build Python, install dependencies
+and set up the environment. Subsequent builds will be much faster as they will use cached layers.
+
+You can choose other Python versions supported by Airflow by changing the ``--python`` argument.
+
+If you use ``breeze`` to run the integration tests and you do not have the image built before,
+``breeze`` will prompt you to build it, and the building will proceed automatically after 20 seconds
+if you do not answer ``no``.
+
+This will build the right image ``ghcr.io/apache/airflow/main/prod/python3.10.latest`` (with the right
+Python version) that will be used to run the tests. The ``breeze prod image build`` command by default -
+when run from sources of airflow - will use the local sources and build the image using ``uv``
+to speed up the build process. Also when building from sources it will check if the assets are built

Review Comment:
   ```suggestion
   to speed up the build process. Also, when building from sources it will check if the assets are built
   ```



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -87,36 +135,103 @@ reproducibility:
    # Run with custom Docker image
   DOCKER_IMAGE=my-custom-airflow-image:latest breeze testing task-sdk-integration-tests
 
-Running in Your Current Virtual Environment
-...........................................
+Using uv
+........
 
-Since you're already working in the Airflow repository, you can run Task SDK Integration Tests
-directly:
+Since you're already working in the Airflow repository, you can run Task SDK Integration Tests directly.
+Make sure you have ``uv`` installed in your environment and make sure to run all those commands

Review Comment:
   ```suggestion
   Make sure you have ``uv`` installed in your environment and make sure to run all these commands
   ```



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -87,36 +135,103 @@ reproducibility:
    # Run with custom Docker image
   DOCKER_IMAGE=my-custom-airflow-image:latest breeze testing task-sdk-integration-tests
 
-Running in Your Current Virtual Environment
-...........................................
+Using uv
+........
 
-Since you're already working in the Airflow repository, you can run Task SDK Integration Tests
-directly:
+Since you're already working in the Airflow repository, you can run Task SDK Integration Tests directly.
+Make sure you have ``uv`` installed in your environment and make sure to run all those commands
+in the ``task-sdk-integration-tests`` directory. You can also run ``uv sync`` in the directory
+first to make sure that your virtual environment is up to date.
+
+.. code-block::
+
+    # Navigate to task-sdk-integration-tests directory
+    cd task-sdk-integration-tests
+
+    # Sync dependencies
+    uv sync
+
+All the ``uv`` and ``docker compose`` commands below should be run from within the
+``task-sdk-integration-tests`` directory.
 
 **Run Tests**
 
 .. code-block:: bash
 
    # Navigate to task-sdk-integration-tests directory and run tests
-   cd task-sdk-integration-tests/
    uv run pytest -s
 
    # Run specific test file
-   cd task-sdk-integration-tests/
    uv run pytest tests/task_sdk_tests/test_task_sdk_health.py -s
 
-   # Keep containers running for debugging
-   cd task-sdk-integration-tests/
-   SKIP_DOCKER_COMPOSE_DELETION=1 uv run pytest -s
-
 **Optional: Set Custom Docker Image**
 
 .. code-block:: bash
 
    # Use a different Airflow image for testing
-   cd task-sdk-integration-tests/
    DOCKER_IMAGE=my-custom-airflow:latest uv run pytest -s
 
+By default when you run your tests locally, the Docker Compose deployment is kept between the sessions,
+your local sources are mounted into the containers and the Airflow services are restarted automatically
+(hot reloaded) when Python sources change.

Review Comment:
   ```suggestion
   By default when you run your tests locally, the Docker Compose deployment is kept between the sessions,
   your local sources are mounted into the containers and the Airflow services are restarted automatically with hot reloading when any Python sources change.
   ```



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -87,36 +135,103 @@ reproducibility:
    # Run with custom Docker image
   DOCKER_IMAGE=my-custom-airflow-image:latest breeze testing task-sdk-integration-tests
 
-Running in Your Current Virtual Environment
-...........................................
+Using uv
+........
 
-Since you're already working in the Airflow repository, you can run Task SDK Integration Tests
-directly:
+Since you're already working in the Airflow repository, you can run Task SDK Integration Tests directly.
+Make sure you have ``uv`` installed in your environment and make sure to run all those commands
+in the ``task-sdk-integration-tests`` directory. You can also run ``uv sync`` in the directory
+first to make sure that your virtual environment is up to date.
+
+.. code-block::
+
+    # Navigate to task-sdk-integration-tests directory
+    cd task-sdk-integration-tests
+
+    # Sync dependencies
+    uv sync
+
+All the ``uv`` and ``docker compose`` commands below should be run from within the
+``task-sdk-integration-tests`` directory.
 
 **Run Tests**
 
 .. code-block:: bash
 
    # Navigate to task-sdk-integration-tests directory and run tests
-   cd task-sdk-integration-tests/
    uv run pytest -s
 
    # Run specific test file
-   cd task-sdk-integration-tests/
    uv run pytest tests/task_sdk_tests/test_task_sdk_health.py -s
 
-   # Keep containers running for debugging
-   cd task-sdk-integration-tests/
-   SKIP_DOCKER_COMPOSE_DELETION=1 uv run pytest -s
-

Review Comment:
   Lol i didn't realise so many `cd`'s



##########
task-sdk-integration-tests/docker-compose.yaml:
##########
@@ -18,7 +18,7 @@
 ---
 x-airflow-common:
   &airflow-common
-  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:3.0.3}

Review Comment:
   Ah damn, good catch



##########
dev/breeze/src/airflow_breeze/commands/common_options.py:
##########
@@ -413,12 +413,12 @@ def _set_default_from_parent(ctx: click.core.Context, option: click.core.Option,
     help="Use uv instead of pip as packaging tool to build the image.",
     envvar="USE_UV",
 )
-option_use_uv_default_disabled = click.option(
+option_use_uv_default_depends_on_installation_method = click.option(
     "--use-uv/--no-use-uv",
     is_flag=True,
-    default=False,
-    show_default=True,
-    help="Use uv instead of pip as packaging tool to build the image.",
+    default=None,

Review Comment:
   `False` is a better default?
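   For context, ``default=None`` makes the flag tri-state: it lets the code tell an explicit ``--no-use-uv`` apart from "flag not passed", which is what the "depends on installation method" behaviour needs. A minimal sketch of that resolution logic (illustrative names, not breeze's actual code):

   ```python
   def resolve_use_uv(use_uv, installed_from_sources):
       """Resolve a tri-state --use-uv/--no-use-uv flag (None = not passed)."""
       if use_uv is not None:
           # The user explicitly asked for uv or pip on the command line.
           return use_uv
       # Flag not given: derive the default from the installation method
       # instead of hard-coding False, which would make an explicit
       # --no-use-uv indistinguishable from "not set".
       return installed_from_sources
   ```

   With ``default=False`` the option could never distinguish those two cases.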



##########
dev/breeze/src/airflow_breeze/utils/run_tests.py:
##########
@@ -43,9 +43,9 @@
 DOCKER_TESTS_TESTS_MODULE_PATH = DOCKER_TESTS_ROOT_PATH / "tests" / "docker_tests"
 DOCKER_TESTS_REQUIREMENTS = DOCKER_TESTS_ROOT_PATH / "requirements.txt"
 
-TASK_SDK_TESTS_ROOT_PATH = AIRFLOW_ROOT_PATH / "task-sdk-integration-tests"
-TASK_SDK_TESTS_TESTS_MODULE_PATH = TASK_SDK_TESTS_ROOT_PATH / "tests" / "task_sdk_tests"
-TASK_SDK_TESTS_REQUIREMENTS = TASK_SDK_TESTS_ROOT_PATH / "requirements.txt"
+TASK_SDK_INTEGRATION_TESTS_ROOT_PATH = AIRFLOW_ROOT_PATH / "task-sdk-integration-tests"
+TASK_SDK_TESTS_TESTS_MODULE_PATH = TASK_SDK_INTEGRATION_TESTS_ROOT_PATH / "tests" / "task_sdk_tests"
+TASK_SDK_TESTS_REQUIREMENTS = TASK_SDK_INTEGRATION_TESTS_ROOT_PATH / "requirements.txt"

Review Comment:
   ```suggestion
   TASK_SDK_INTEGRATION_TESTS_MODULE_PATH = TASK_SDK_INTEGRATION_TESTS_ROOT_PATH / "tests" / "task_sdk_tests"
   ```



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -206,10 +317,44 @@ The Task SDK Integration Tests are organized as follows:
 **Key Files:**
 
 - **docker-compose.yaml**: Defines the complete Airflow environment (postgres, scheduler, api-server)
+- **docker-compose.yaml**: Defines mounts used to mount local sources to the containers for local testing

Review Comment:
   ```suggestion
   - **docker-compose-local.yaml**: Defines mounts used to mount local sources to the containers for local testing
   ```



##########
task-sdk-integration-tests/logs/.gitignore:
##########
@@ -0,0 +1,2 @@
+# ignore all files generated in logs directory

Review Comment:
   Nice



##########
task-sdk-integration-tests/docker-compose-local.yaml:
##########
@@ -0,0 +1,45 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+---
+# If the image is locally built from the repo we can mount sources of Airflow so that
+# we do not have to rebuild the image on every change
+services:
+  airflow-init:
+    volumes:
+      - ../airflow-core/src:/opt/airflow/airflow-core/src
+      - ../task-sdk/src:/opt/airflow/task-sdk/src
+    environment:
+      - DEV_MODE=true
+  airflow-apiserver:
+    volumes:
+      - ../airflow-core/src:/opt/airflow/airflow-core/src
+      - ../task-sdk/src:/opt/airflow/task-sdk/src
+    environment:
+      - DEV_MODE=true
+  airflow-scheduler:
+    volumes:
+      - ../airflow-core/src:/opt/airflow/airflow-core/src
+      - ../task-sdk/src:/opt/airflow/task-sdk/src
+    environment:
+      - DEV_MODE=true
+  airflow-dag-processor:
+    volumes:
+      - ../airflow-core/src:/opt/airflow/airflow-core/src
+      - ../task-sdk/src:/opt/airflow/task-sdk/src
+    environment:
+      - DEV_MODE=true

Review Comment:
   Can we define common `env` instead?
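   That should be doable with a compose extension field plus a YAML anchor and merge key, something like the sketch below (untested against this PR's setup; service names copied from the file above):

   ```yaml
   ---
   # Shared mounts and environment defined once via an extension field
   # and a YAML anchor, then merged into each service with <<:
   x-local-dev: &local-dev
     volumes:
       - ../airflow-core/src:/opt/airflow/airflow-core/src
       - ../task-sdk/src:/opt/airflow/task-sdk/src
     environment:
       - DEV_MODE=true
   services:
     airflow-init:
       <<: *local-dev
     airflow-apiserver:
       <<: *local-dev
     airflow-scheduler:
       <<: *local-dev
     airflow-dag-processor:
       <<: *local-dev
   ```

   Note the merge key only does a shallow mapping merge, so it works here because no service overrides ``volumes`` or ``environment`` individually.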



##########
contributing-docs/testing/task_sdk_integration_tests.rst:
##########
@@ -87,36 +135,103 @@ reproducibility:
    # Run with custom Docker image
   DOCKER_IMAGE=my-custom-airflow-image:latest breeze testing task-sdk-integration-tests
 
-Running in Your Current Virtual Environment
-...........................................
+Using uv
+........
 
-Since you're already working in the Airflow repository, you can run Task SDK Integration Tests
-directly:
+Since you're already working in the Airflow repository, you can run Task SDK Integration Tests directly.
+Make sure you have ``uv`` installed in your environment and make sure to run all those commands
+in the ``task-sdk-integration-tests`` directory. You can also run ``uv sync`` in the directory
+first to make sure that your virtual environment is up to date.
+
+.. code-block::
+
+    # Navigate to task-sdk-integration-tests directory
+    cd task-sdk-integration-tests
+
+    # Sync dependencies
+    uv sync
+
+All the ``uv`` and ``docker compose`` commands below should be run from within the
+``task-sdk-integration-tests`` directory.
 
 **Run Tests**
 
 .. code-block:: bash
 
    # Navigate to task-sdk-integration-tests directory and run tests
-   cd task-sdk-integration-tests/
    uv run pytest -s
 
    # Run specific test file
-   cd task-sdk-integration-tests/
    uv run pytest tests/task_sdk_tests/test_task_sdk_health.py -s
 
-   # Keep containers running for debugging
-   cd task-sdk-integration-tests/
-   SKIP_DOCKER_COMPOSE_DELETION=1 uv run pytest -s
-
 **Optional: Set Custom Docker Image**
 
 .. code-block:: bash
 
    # Use a different Airflow image for testing
-   cd task-sdk-integration-tests/
    DOCKER_IMAGE=my-custom-airflow:latest uv run pytest -s
 
+By default when you run your tests locally, the Docker Compose deployment is kept between the sessions,
+your local sources are mounted into the containers and the Airflow services are restarted automatically
+(hot reloaded) when Python sources change.
+
+This allows for quick iterations without rebuilding the image or restarting the containers.
+
+Generated .env file
+...................
+
+When you run the tests an .env file is generated in the task-sdk-integration-tests directory.
+This file contains environment variables used by docker-compose to configure the services.
+You can inspect or modify this file if you need to change any configurations so that you can
+also debug issues by running ``docker compose`` commands directly.
+
+When running the tests with VERBOSE=1 environment variable set or --verbose flag passed to breeze command,
+the docker-compose commands used to start the services are also printed to the console and you can copy
+them to run them directly.
+
+Stopping docker-compose
+.......................
+
+When you finish testing (or when you updated dependencies and rebuild your images),
+you likely want to stop the running containers. You can stop the running containers by running:
+
+.. code-block:: bash
+
+   # Stop and remove containers
+   docker-compose down -v --remove-orphans
+
+amd with breeze:

Review Comment:
   ```suggestion
   and with breeze:
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
