This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 99d37e2d35a9d82103b35e4042c27a7f5620b568
Author: Jarek Potiuk <[email protected]>
AuthorDate: Wed Jul 1 16:02:24 2020 +0200

    Update Breeze documentation (#9608)
    
    * Update Breeze documentation
    
    (cherry picked from commit f3e1f9a313d8a6f841f6a5c9f2663518fee16b8f)
---
 BREEZE.rst  | 293 ++++++++++++++++++++++++++++++++++++++++--------------------
 TESTING.rst |   2 +-
 2 files changed, 198 insertions(+), 97 deletions(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index 9b318e2..735286a 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -232,44 +232,6 @@ from your ``logs`` directory in the Airflow sources, so 
all logs created in the
 visible in the host as well. Every time you enter the container, the ``logs`` 
directory is
 cleaned so that logs do not accumulate.
 
-CLIs for cloud providers
-========================
-
-For development convenience we installed simple wrappers for the most common 
cloud providers CLIs. Those
-CLIs are not installed when you build or pull the image - they will be 
downloaded as docker images
-the first time you attempt to use them. It is downloaded and executed in your 
host's docker engine so once
-it is downloaded, it will stay until you remove the downloaded images from 
your host container.
-
-For each of those CLI credentials are taken (automatically) from the 
credentials you have defined in
-your ${HOME} directory on host.
-
-Those tools also have host Airflow source directory mounted in /opt/airflow 
path
-so you can directly transfer files to/from your airflow host sources.
-
-Those are currently installed CLIs (they are available as aliases to the 
docker commands):
-
-+-----------------------+----------+-------------------------------------------------+-------------------+
-| Cloud Provider        | CLI tool | Docker image                              
      | Configuration dir |
-+=======================+==========+=================================================+===================+
-| Amazon Web Services   | aws      | amazon/aws-cli:latest                     
      | .aws              |
-+-----------------------+----------+-------------------------------------------------+-------------------+
-| Microsoft Azure       | az       | mcr.microsoft.com/azure-cli:latest        
      | .azure            |
-+-----------------------+----------+-------------------------------------------------+-------------------+
-| Google Cloud Platform | bq       | 
gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
-|                       
+----------+-------------------------------------------------+-------------------+
-|                       | gcloud   | 
gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
-|                       
+----------+-------------------------------------------------+-------------------+
-|                       | gsutil   | 
gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
-+-----------------------+----------+-------------------------------------------------+-------------------+
-
-For each of the CLIs we have also an accompanying ``*-update`` alias (for 
example ``aws-update``) which
-will pull the latest image for the tool. Note that all Google Cloud Platform 
tools are served by one
-image and they are updated together.
-
-Also - in case you run several different Breeze containers in parallel (from 
different directories,
-with different versions) - they docker images for CLI Cloud Providers tools 
are shared so if you update it
-for one Breeze container, they will also get updated for all the other 
containers.
-
 Using the Airflow Breeze Environment
 =====================================
 
@@ -287,6 +249,7 @@ Managing CI environment:
     * Stop running interactive environment with ``breeze stop`` command
     * Restart running interactive environment with ``breeze restart`` command
     * Run test specified with ``breeze tests`` command
+    * Generate requirements with ``breeze generate-requirements`` command
     * Execute arbitrary command in the test environment with ``breeze shell`` 
command
     * Execute arbitrary docker-compose command with ``breeze docker-compose`` 
command
     * Push docker images with ``breeze push-image`` command (require 
committer's rights to push images)
@@ -319,7 +282,7 @@ Manage and Interact with Kubernetes tests environment:
 Run static checks:
 
     * Run static checks - either for currently staged change or for all files 
with
-      ``breeze static-check`` or ``breeze static-check-all-files`` command
+      ``breeze static-check`` command
 
 Build documentation:
 
@@ -330,10 +293,12 @@ Set up local development environment:
     * Setup local virtualenv with ``breeze setup-virtualenv`` command
     * Setup autocomplete for itself with ``breeze setup-autocomplete`` command
 
-
-Entering Breeze CI environment
+Interactive Breeze Environment
 ------------------------------
 
+Entering Breeze environment
+...........................
+
 You enter the Breeze test environment by running the ``./breeze`` script. You 
can run it with
 the ``help`` command to see the list of available options. See `Breeze 
Command-Line Interface Reference`_
 for details.
@@ -359,8 +324,15 @@ When you enter the Breeze environment, automatically an 
environment file is sour
 automatically mounted to the container under ``/files`` path and you can put 
there any files you want
 to make available for the Breeze container.
 
+Running tests in the CI interactive environment
+-----------------------------------------------
+
+Breeze helps you run tests in the same environment and in the same way as CI tests are run. You can run various
+types of tests while inside the Breeze CI interactive environment - this is described in detail
+in `<TESTING.rst>`_.
+
 Launching multiple terminals
-----------------------------
+............................
 
 Often if you want to run full airflow in the Breeze environment you need to 
launch multiple terminals and
 run ``airflow webserver``, ``airflow scheduler``, ``airflow worker`` in 
separate terminals.
@@ -377,8 +349,47 @@ to enter the running container. It's as easy as launching 
``breeze exec`` while
 Breeze environment. You will be dropped into bash and environment variables 
will be read in the same
 way as when you enter the environment. You can do it multiple times and open 
as many terminals as you need.
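
+For example, a typical multi-terminal session might look like the sketch below (this assumes the
+Breeze environment is already running and uses commands described elsewhere in this document):
+
+.. code-block:: bash
+
+    # Terminal 1: enter the Breeze environment and start the webserver
+    ./breeze
+    airflow webserver
+
+    # Terminal 2: attach a second shell to the same running container
+    ./breeze exec
+    airflow scheduler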
 
+
+CLIs for cloud providers
+........................
+
+For development convenience, we have installed simple wrappers for the most common cloud providers' CLIs. Those
+CLIs are not installed when you build or pull the image - they are downloaded as docker images
+the first time you attempt to use them. Each image is downloaded to and executed by your host's docker engine,
+so once it is downloaded, it stays until you remove the image from your host.
+
+For each of those CLIs, credentials are taken automatically from the credentials you have defined in
+your ${HOME} directory on the host.
+
+Those tools also have the host Airflow source directory mounted under the /opt/airflow path,
+so you can directly transfer files to/from your Airflow sources on the host.
+
+These are the currently installed CLIs (they are available as aliases to the docker commands):
+
++-----------------------+----------+-------------------------------------------------+-------------------+
+| Cloud Provider        | CLI tool | Docker image                                    | Configuration dir |
++=======================+==========+=================================================+===================+
+| Amazon Web Services   | aws      | amazon/aws-cli:latest                           | .aws              |
++-----------------------+----------+-------------------------------------------------+-------------------+
+| Microsoft Azure       | az       | mcr.microsoft.com/azure-cli:latest              | .azure            |
++-----------------------+----------+-------------------------------------------------+-------------------+
+| Google Cloud Platform | bq       | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
+|                       +----------+-------------------------------------------------+-------------------+
+|                       | gcloud   | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
+|                       +----------+-------------------------------------------------+-------------------+
+|                       | gsutil   | gcr.io/google.com/cloudsdktool/cloud-sdk:latest | .config/gcloud    |
++-----------------------+----------+-------------------------------------------------+-------------------+
+
+For each of the CLIs, there is also an accompanying ``*-update`` alias (for example ``aws-update``) which
+pulls the latest image for the tool. Note that all Google Cloud Platform tools are served by one
+image and are updated together.
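+
+For example, the aliases can be used like this (a sketch, assuming you have AWS credentials
+configured under ``${HOME}/.aws``):
+
+.. code-block:: bash
+
+    # Run the dockerized AWS CLI via its alias
+    aws s3 ls
+
+    # Pull the latest image for the aws tool
+    aws-update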
+
+Also, in case you run several different Breeze containers in parallel (from different directories,
+with different versions), the docker images for the cloud provider CLI tools are shared, so if you update
+the image for one Breeze container, it is updated for all the other containers as well.
+
 Stopping Interactive environment
---------------------------------
+................................
 
 After starting up, the environment runs in the background and takes precious 
memory.
 You can always stop it via:
@@ -388,7 +399,7 @@ You can always stop it via:
    ./breeze stop
 
 Restarting Breeze environment
------------------------------
+.............................
 
 You can also  restart the environment and enter it via:
 
@@ -396,8 +407,8 @@ You can also  restart the environment and enter it via:
 
    ./breeze restart
 
-Choosing a Breeze Environment
------------------------------
+Choosing different Breeze Environment configuration
+...................................................
 
 You can use additional ``breeze`` flags to customize your environment. For 
example, you can specify a Python
 version to use, backend and a container environment for testing. With Breeze, 
you can recreate the same
@@ -417,8 +428,8 @@ default settings.
 
 The defaults when you run the Breeze environment are Python 3.6, Sqlite, and 
Docker.
 
-Launching Breeze Integrations
------------------------------
+Launching Breeze integrations
+.............................
 
 When Breeze starts, it can start additional integrations. Those are additional 
docker containers
 that are started in the same docker-compose command. Those are required by 
some of the tests
@@ -436,8 +447,65 @@ Once integration is started, it will continue to run until 
the environment is st
 
 Note that running integrations uses significant resources - CPU and memory.
 
+Mounting Local Sources to Breeze
+................................
+
+Important sources of Airflow are mounted inside the ``airflow`` container that 
you enter.
+This means that you can continue editing your changes on the host in your favourite IDE and have them
+visible in Docker immediately, ready to test without rebuilding images. You can disable mounting
+by specifying the ``--skip-mounting-local-sources`` flag when running Breeze. In this case, sources are
+embedded in the container and changes to these sources are not persistent.
+
+
+After you run Breeze for the first time, you will have empty directory 
``files`` in your source code,
+which will be mapped to ``/files`` in your Docker container. You can pass 
there any files you need to
+configure and run Docker. They will not be removed between Docker runs.
+
+By default, the ``/files/dags`` folder is mounted from your local ``<AIRFLOW_SOURCES>/files/dags`` directory,
+and this is the directory that the airflow scheduler and webserver scan for DAGs. You can use it to test
+your DAGs from local sources in Airflow if you wish to add local DAGs that can be run by Breeze.
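+
+For example, to make a local DAG visible to the scheduler running in Breeze (a sketch;
+``my_dag.py`` is a hypothetical file):
+
+.. code-block:: bash
+
+    # On the host: drop a DAG file into the mounted dags folder
+    cp my_dag.py <AIRFLOW_SOURCES>/files/dags/
+
+    # Inside the Breeze container: the scheduler picks it up from /files/dags
+    airflow list_dags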
+
+Port Forwarding
+...............
+
+When you run Airflow Breeze, the following ports are automatically forwarded:
+
+* 28080 -> forwarded to Airflow webserver -> airflow:8080
+* 25433 -> forwarded to Postgres database -> postgres:5432
+* 23306 -> forwarded to MySQL database  -> mysql:3306
+
+You can connect to these ports/databases using:
+
+* Webserver: ``http://127.0.0.1:28080``
+* Postgres: 
``jdbc:postgresql://127.0.0.1:25433/airflow?user=postgres&password=airflow``
+* Mysql: ``jdbc:mysql://localhost:23306/airflow?user=root``
+
+Start the webserver manually with the ``airflow webserver`` command if you want to connect
+to the webserver. You can use ``tmux`` to multiplex terminals. You may need to create a user prior to
+running the webserver in order to log in. This can be done with the following command:
+
+.. code-block:: bash
+
+    airflow users create --role Admin --username admin --password admin 
--email [email protected] --firstname foo --lastname bar
+
+For databases, you need to run ``airflow db reset`` at least once (or run some tests) after you start
+Airflow Breeze to get the database/tables created. You can connect to the databases with an IDE or any
+other database client:
+
+.. image:: images/database_view.png
+    :align: center
+    :alt: Database view
+
+You can change the used host port numbers by setting appropriate environment 
variables:
+
+* ``WEBSERVER_HOST_PORT``
+* ``POSTGRES_HOST_PORT``
+* ``MYSQL_HOST_PORT``
+
+If you set these variables, the new ports will be in effect the next time you enter the environment.
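+
+For example, to move the webserver to host port 18080 (a sketch; pick any free port on your host):
+
+.. code-block:: bash
+
+    export WEBSERVER_HOST_PORT=18080
+    ./breeze restart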
+
 Cleaning the Environment
-------------------------
+........................
 
 You may need to clean up your Docker environment occasionally. The images are 
quite big
 (1.5GB for both images needed for static code analysis and CI tests) and, if 
you often rebuild/update
@@ -459,54 +527,74 @@ command. You may need to restart the Docker Engine before 
running this command.
 In case of disk space errors on macOS, increase the disk space available for 
Docker. See
 `Prerequisites <#prerequisites>`_ for details.
 
-Running Arbitrary Commands in the Breeze Environment
--------------------------------------------------------
+Running static checks
+---------------------
 
-To run other commands/executables inside the Breeze Docker-based environment, 
use the
-``./breeze execute-command`` command. To add arguments, specify them
-together with the command surrounded with either ``"`` or ``'``, or pass them 
after ``--`` as extra arguments.
+You can run static checks via Breeze. You can also run them via the pre-commit command, but with
+auto-completion Breeze makes it easier to run selected static checks. If you press <TAB> after
+``static-check`` and you have auto-completion set up, you should see the list of all available checks.
 
 .. code-block:: bash
 
-     ./breeze execute-command "ls -la"
+     ./breeze static-check mypy
+
+The above runs the mypy check for currently staged files.
+
+You can also pass arbitrary pre-commit flags after ``--``:
 
 .. code-block:: bash
 
-     ./breeze execute-command ls -- --la
+     ./breeze static-check mypy -- --all-files
 
+The above runs the mypy check for all files.
 
-Running Docker Compose Commands
--------------------------------
+Running Kubernetes tests in virtual environment
+-----------------------------------------------
 
-To run Docker Compose commands (such as ``help``, ``pull``, etc), use the
-``docker-compose`` command. To add extra arguments, specify them
-after ``--`` as extra arguments.
+Breeze helps you run Kubernetes tests in the same environment and in the same way as CI tests are run.
+It helps to set up a KinD cluster for testing, sets up a virtualenv, and automatically downloads the
+right tools to run the tests.
+
+This is described in `Testing Kubernetes 
<TESTING.rst#running-tests-with-kubernetes>`_ in detail.
+
+Building the Documentation
+--------------------------
+
+To build documentation in Breeze, use the ``build-docs`` command:
 
 .. code-block:: bash
 
-     ./breeze docker-compose pull -- --ignore-pull-failures
+     ./breeze build-docs
 
+Results of the build can be found in the ``docs/_build`` folder.
 
-Mounting Local Sources to Breeze
---------------------------------
+Errors during documentation generation often come from the docstrings of auto-api generated classes.
+During the docs build, auto-api generated files are stored in the ``docs/_api`` folder. This helps you
+easily identify where the documentation problems originated.
 
-Important sources of Airflow are mounted inside the ``airflow`` container that 
you enter.
-This means that you can continue editing your changes on the host in your 
favourite IDE and have them
-visible in the Docker immediately and ready to test without rebuilding images. 
You can disable mounting
-by specifying ``--skip-mounting-local-sources`` flag when running Breeze. In 
this case you will have sources
-embedded in the container and changes to these sources will not be persistent.
+Running Arbitrary Commands in the Breeze Environment
+----------------------------------------------------
 
+To run other commands/executables inside the Breeze Docker-based environment, use the
+``./breeze shell`` command. Add your command as ``-c "command"`` after ``--`` in the extra arguments:
 
-After you run Breeze for the first time, you will have empty directory 
``files`` in your source code,
-which will be mapped to ``/files`` in your Docker container. You can pass 
there any files you need to
-configure and run Docker. They will not be removed between Docker runs.
+.. code-block:: bash
 
-By default ``/files/dags`` folder is mounted from your local 
``<AIRFLOW_SOURCES>/files/dags`` and this is
-the directory used by airflow scheduler and webserver to scan dags for. You 
can use it to test your dags
-from local sources in Airflow. If you wish to add local DAGs that can be run 
by Breeze.
+     breeze shell -- -c "ls -la"
+
+Running Docker Compose Commands
+-------------------------------
+
+To run Docker Compose commands (such as ``help``, ``pull``, etc), use the
+``docker-compose`` command. To add extra arguments, specify them
+after ``--`` as extra arguments.
 
-Adding/Modifying Dependencies
------------------------------
+.. code-block:: bash
+
+     ./breeze docker-compose pull -- --ignore-pull-failures
+
+Managing Dependencies
+---------------------
 
 If you need to change apt dependencies in the ``Dockerfile.ci``, add Python 
packages in ``setup.py`` or
 add javascript dependencies in ``package.json``, you can either add 
dependencies temporarily for a single
@@ -531,8 +619,34 @@ asks you to confirm rebuilding the image and proceeds with 
rebuilding if you con
 if you do not confirm). After rebuilding is done, Breeze drops you to shell. 
You may also use the
 ``build-image`` command to only rebuild CI image and not to go into shell.
 
-Changing apt Dependencies in the Dockerfile.ci
-..............................................
+Generating requirements
+.......................
+
+Whenever you modify and commit setup.py, you need to re-generate the requirement files. Those requirement
+files are stored separately for each python version in the ``requirements`` folder. They are
+constraints rather than requirements, as described in detail in the
+`CONTRIBUTING.rst <CONTRIBUTING.rst#pinned-requirement-files>`_ contributing documentation.
+
+In case you modify setup.py, you need to update the requirements for every supported python version:
+
+.. code-block:: bash
+
+  breeze generate-requirements --python 3.6
+
+.. code-block:: bash
+
+  breeze generate-requirements --python 3.7
+
+.. code-block:: bash
+
+  breeze generate-requirements --python 3.8
+
+
+This bumps the requirements to the latest versions and stores a hash of setup.py so that the
+requirements are automatically upgraded when new ones are added.
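+
+Equivalently, you can regenerate the files for all supported versions in one loop (a convenience
+sketch of the three commands above):
+
+.. code-block:: bash
+
+  for python_version in 3.6 3.7 3.8
+  do
+    breeze generate-requirements --python ${python_version}
+  done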
+
+Incremental apt Dependencies in the Dockerfile.ci during development
+....................................................................
 
 During development, changing dependencies in ``apt-get`` closer to the top of 
the ``Dockerfile.ci``
 invalidates cache for most of the image. It takes long time for Breeze to 
rebuild the image.
@@ -581,8 +695,8 @@ You can change the used host port numbers by setting 
appropriate environment var
 
 If you set these variables, next time when you enter the environment the new 
ports should be in effect.
 
-Setting Up Autocompletion
--------------------------
+Setting Up the auto-completion
+------------------------------
 
 The ``breeze`` command comes with a built-in bash/zsh autocomplete option for 
its options. When you start typing
 the command, you can use <TAB> to show all the available switches and get 
autocompletion on typical
@@ -599,8 +713,8 @@ You get the autocompletion working when you re-enter the 
shell.
 Zsh autocompletion is currently limited to only autocomplete options. Bash 
autocompletion also completes
 options values (for example, Python version or static check name).
 
-Setting Defaults for User Interaction
---------------------------------------
+Setting default answers for User Interaction
+--------------------------------------------
 
 Sometimes during the build, you are asked whether to perform an action, skip 
it, or quit. This happens
 when rebuilding or removing an image - actions that take a lot of time and 
could be potentially destructive.
@@ -664,19 +778,6 @@ This is a lightweight solution that has its own 
limitations.
 
 More details on using the local virtualenv are available in the 
`LOCAL_VIRTUALENV.rst <LOCAL_VIRTUALENV.rst>`_.
 
-Running static checks in Breeze
-===============================
-
-The Breeze environment is also used to run some of the static checks as 
described in
-`STATIC_CODE_CHECKS.rst <STATIC_CODE_CHECKS.rst>`_.
-
-
-Running Tests in Breeze
-=======================
-
-As soon as you enter the Breeze environment, you can run Airflow unit tests 
via the ``pytest`` command.
-
-For supported CI test suites, types of unit tests, and other tests, see 
`TESTING.rst <TESTING.rst>`_.
 
 Breeze Command-Line Interface Reference
 =======================================
diff --git a/TESTING.rst b/TESTING.rst
index 5cc681e..7c761e8 100644
--- a/TESTING.rst
+++ b/TESTING.rst
@@ -518,7 +518,7 @@ print output generated test logs and print statements to 
the terminal immediatel
 You can modify the tests or KubernetesPodOperator and re-run them without 
re-deploying
 airflow to KinD cluster.
 
-However when you change Airflow Kubernetes executor implementation you need to 
redeploy
+However, when you change Airflow Kubernetes executor implementation you need 
to redeploy
 Airflow to the cluster.
 
 .. code-block:: bash
