potiuk commented on a change in pull request #16170:
URL: https://github.com/apache/airflow/pull/16170#discussion_r642123454



##########
File path: docs/docker-stack/entrypoint.rst
##########
@@ -120,8 +120,82 @@ takes precedence over the :envvar:`AIRFLOW__CORE__SQL_ALCHEMY_CONN` variable.
 For newer versions, the ``airflow db check`` command is used, which means that a ``select 1 as is_alive;`` query
 is executed. This also means that you can keep your password in secret backend.
 
+Waits for celery broker connection
+----------------------------------
+
+In case Postgres or MySQL DB is used, and one of the ``scheduler``, ``celery``, ``worker``, or ``flower``
+commands is used, the entrypoint will wait until the celery broker connection is available.
+
+The script detects the backend type depending on the URL scheme and assigns default port numbers if they are
+not specified in the URL. Then it loops until a connection to the specified host/port can be established.
+It tries :envvar:`CONNECTION_CHECK_MAX_COUNT` times and sleeps for :envvar:`CONNECTION_CHECK_SLEEP_TIME` between checks.
+To disable the check, set ``CONNECTION_CHECK_MAX_COUNT=0``.
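+
+For example, the wait can be disabled entirely (a minimal sketch; the image tag and command are only
+illustrative):
+
+.. code-block:: bash
+
+  # CONNECTION_CHECK_MAX_COUNT=0 skips the connection checks in the entrypoint
+  docker run -e CONNECTION_CHECK_MAX_COUNT=0 apache/airflow:2.1.0-python3.6 airflow celery worker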
+
+Supported schemes:
+
+* ``amqp(s)://``  (rabbitmq) - default port 5672
+* ``redis://``               - default port 6379
+* ``postgres://``            - default port 5432
+* ``mysql://``               - default port 3306
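+
+For example (a sketch; the ``redis`` hostname and database ``0`` are only illustrative), a broker URL
+without an explicit port is checked on the default port for its scheme:
+
+.. code-block:: bash
+
+  # No port in the URL, so the entrypoint waits for redis:6379 (the redis default)
+  docker run -e AIRFLOW__CELERY__BROKER_URL="redis://redis/0" \
+    apache/airflow:2.1.0-python3.6 airflow celery worker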
+
+Waiting for connection involves checking if a matching port is open.
+The host information is derived from the :envvar:`AIRFLOW__CELERY__BROKER_URL` and
+:envvar:`AIRFLOW__CELERY__BROKER_URL_CMD` variables. If the :envvar:`AIRFLOW__CELERY__BROKER_URL_CMD` variable
+is passed to the container, it is evaluated as a command to execute and the result of this evaluation is used
+as :envvar:`AIRFLOW__CELERY__BROKER_URL`. The :envvar:`AIRFLOW__CELERY__BROKER_URL_CMD` variable
+takes precedence over the :envvar:`AIRFLOW__CELERY__BROKER_URL` variable.
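+
+A minimal sketch of supplying the broker URL via the ``_CMD`` variant (the secret file path is only
+illustrative):
+
+.. code-block:: bash
+
+  # The command output (the file contents here) becomes AIRFLOW__CELERY__BROKER_URL
+  docker run \
+    -e AIRFLOW__CELERY__BROKER_URL_CMD="cat /run/secrets/broker_url" \
+    apache/airflow:2.1.0-python3.6 airflow celery worker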
+
+.. _entrypoint:commands:
+
+Executing commands
+------------------
+
+If first argument equals to "bash" - you are dropped to a bash shell or you 
can executes bash command
+if you specify extra arguments. For example:
+
+.. code-block:: bash
+
+  docker run -it apache/airflow:2.1.0-python3.6 bash -c "ls -la"
+  total 16
+  drwxr-xr-x 4 airflow root 4096 Jun  5 18:12 .
+  drwxr-xr-x 1 root    root 4096 Jun  5 18:12 ..
+  drwxr-xr-x 2 airflow root 4096 Jun  5 18:12 dags
+  drwxr-xr-x 2 airflow root 4096 Jun  5 18:12 logs
+
+If the first argument equals ``python``, you are dropped into a Python shell, or Python commands are executed
+if you pass extra parameters. For example:
+
+.. code-block:: bash
+
+  > docker run -it apache/airflow:2.1.0-python3.6 python -c "print('test')"
+  test
+
+If first argument equals to "airflow" - the rest of the arguments is treated 
as an airflow command
+to execute. Example:
+
+.. code-block:: bash
+
+   docker run -it apache/airflow:2.1.0-python3.6 airflow webserver
+
+If there are any other arguments - they are simply passed to the "airflow" 
command
+
+.. code-block:: bash
+
+  > docker run -it apache/airflow:2.1.0-python3.6 version
+  2.1.0
+
+Additional quick test options
+-----------------------------
+
+The options below are mostly used for quick testing of the image - for example with the
+quick-start docker-compose or when you want to perform a local test with new packages
+added. They are not supposed to be run in the production environment as they add additional
+overhead for the execution of additional commands. In production, those options should be realized
+either as maintenance operations on the database or should be embedded in teh custom image used

Review comment:
       ```suggestion
   either as maintenance operations on the database or should be embedded in the custom image used
   ```



