This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 637425f21f955239ebaf17e449f9dfefe713512d
Author: Jarek Potiuk <[email protected]>
AuthorDate: Mon Dec 7 15:17:49 2020 +0100

    Adds airflow as viable docker command in official image (#12878)
    
    The change is backwards-compatible. It still allows passing an airflow
    command without "airflow" as the first parameter, but you can now
    also pass "airflow" and the rest of the parameters will
    be treated as "airflow" command parameters.
    
    Documentation is updated to reflect the entrypoint behaviour,
    including the _CMD option for SQL connections.
    
    Part of #12762 and #12602
    
    Partially extracted from #12766
    
    (cherry picked from commit 4d44faac77b639a19379da714bf532ceb9416a1b)
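
The dispatch order this change introduces can be sketched as a small shell function. This is an illustrative sketch, not the entrypoint itself: "dispatch" is a hypothetical helper name, it prints the command it would exec instead of exec-ing it, and the real script also runs the DB and broker checks around these branches.

```shell
#!/usr/bin/env bash
# Illustrative sketch of the entrypoint's argument dispatch
# (hypothetical helper; prints the command instead of exec-ing it).
dispatch() {
    local AIRFLOW_COMMAND="${1:-}"
    if [[ ${AIRFLOW_COMMAND} == "bash" ]]; then
        shift
        echo "exec /bin/bash ${*}"
    elif [[ ${AIRFLOW_COMMAND} == "python" ]]; then
        shift
        echo "exec python ${*}"
    elif [[ ${AIRFLOW_COMMAND} == "airflow" ]]; then
        # New behaviour: strip the leading "airflow" and treat the rest
        # as airflow command parameters.
        shift
        echo "exec airflow ${*}"
    else
        # Backwards-compatible path: arguments go straight to "airflow".
        echo "exec airflow ${*}"
    fi
}
```

Both "docker run ... airflow webserver" and "docker run ... webserver" therefore end up running the same "airflow webserver" command.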
---
 docs/production-deployment.rst               | 37 +++++++++++++++++++---------
 scripts/in_container/prod/entrypoint_prod.sh | 22 +++++++++++------
 2 files changed, 39 insertions(+), 20 deletions(-)

diff --git a/docs/production-deployment.rst b/docs/production-deployment.rst
index ac6c76d..7964b34 100644
--- a/docs/production-deployment.rst
+++ b/docs/production-deployment.rst
@@ -323,20 +323,20 @@ The PROD image entrypoint works as follows:
   This is in order to accommodate the
  `OpenShift Guidelines <https://docs.openshift.com/enterprise/3.0/creating_images/guidelines.html>`_
 
-* If ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is passed to the container and it is either mysql or postgres
-  SQL alchemy connection, then the connection is checked and the script waits until the database is reachable.
-
-* If no ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is set or if it is set to sqlite SQL alchemy connection
-  then db reset is executed.
-
-* If ``AIRFLOW__CELERY__BROKER_URL`` variable is passed and scheduler, worker of flower command is used then
-  the connection is checked and the script waits until the Celery broker database is reachable.
-
 * The ``AIRFLOW_HOME`` is set by default to ``/opt/airflow/`` - this means that DAGs
   are in default in the ``/opt/airflow/dags`` folder and logs are in the ``/opt/airflow/logs``
 
 * The working directory is ``/opt/airflow`` by default.
 
+* If ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is passed to the container and it is either a mysql or postgres
+  SQL alchemy connection, then the connection is checked and the script waits until the database is reachable.
+  If ``AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD`` variable is passed to the container, it is evaluated as a
+  command to execute and the result of this evaluation is used as ``AIRFLOW__CORE__SQL_ALCHEMY_CONN``. The
+  ``_CMD`` variable takes precedence over the ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable.
+
+* If no ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is set then a SQLite database is created in
+  ``${AIRFLOW_HOME}/airflow.db`` and db reset is executed.
+
 * If first argument equals to "bash" - you are dropped to a bash shell or you can executes bash command
   if you specify extra arguments. For example:
 
@@ -349,7 +349,6 @@ The PROD image entrypoint works as follows:
   drwxr-xr-x 2 airflow root 4096 Jun  5 18:12 dags
   drwxr-xr-x 2 airflow root 4096 Jun  5 18:12 logs
 
-
 * If first argument is equal to "python" - you are dropped in python shell or python commands are executed if
   you pass extra parameters. For example:
 
@@ -358,13 +357,27 @@ The PROD image entrypoint works as follows:
   > docker run -it apache/airflow:master-python3.6 python -c "print('test')"
   test
 
-* If there are any other arguments - they are passed to "airflow" command
+* If first argument equals to "airflow" - the rest of the arguments are treated as an airflow command
+  to execute. Example:
 
 .. code-block:: bash
 
-  > docker run -it apache/airflow:master-python3.6
+  > docker run -it apache/airflow:master-python3.6 airflow webserver
+
+* If there are any other arguments - they are simply passed to the "airflow" command
+
+.. code-block:: bash
+
+  > docker run -it apache/airflow:master-python3.6 version
   2.0.0.dev0
 
+* If ``AIRFLOW__CELERY__BROKER_URL`` variable is passed and an airflow command with
+  scheduler, worker or flower is used, then the script checks the broker connection
+  and waits until the Celery broker database is reachable.
+  If ``AIRFLOW__CELERY__BROKER_URL_CMD`` variable is passed to the container, it is evaluated as a
+  command to execute and the result of this evaluation is used as ``AIRFLOW__CELERY__BROKER_URL``. The
+  ``_CMD`` variable takes precedence over the ``AIRFLOW__CELERY__BROKER_URL`` variable.
+
 Production image build arguments
 --------------------------------
 
diff --git a/scripts/in_container/prod/entrypoint_prod.sh b/scripts/in_container/prod/entrypoint_prod.sh
index 60103e7..0276e69 100755
--- a/scripts/in_container/prod/entrypoint_prod.sh
+++ b/scripts/in_container/prod/entrypoint_prod.sh
@@ -98,11 +98,11 @@ if ! whoami &> /dev/null; then
   export HOME="${AIRFLOW_USER_HOME_DIR}"
 fi
 
-
 # Warning: command environment variables (*_CMD) have priority over usual configuration variables
 # for configuration parameters that require sensitive information. This is the case for the SQL database
 # and the broker backend in this entrypoint script.
 
+
 if [[ -n "${AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=}" ]]; then
     verify_db_connection "$(eval "$AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD")"
 else
@@ -111,6 +111,19 @@ else
     verify_db_connection "${AIRFLOW__CORE__SQL_ALCHEMY_CONN}"
 fi
 
+# The bash and python commands should still verify the basic connections, so they are run after the
+# DB check but before the broker check
+if [[ ${AIRFLOW_COMMAND} == "bash" ]]; then
+   shift
+   exec "/bin/bash" "${@}"
+elif [[ ${AIRFLOW_COMMAND} == "python" ]]; then
+   shift
+   exec "python" "${@}"
+elif [[ ${AIRFLOW_COMMAND} == "airflow" ]]; then
+   AIRFLOW_COMMAND="${2}"
+   shift
+fi
+
 # Note: the broker backend configuration concerns only a subset of Airflow components
 if [[ ${AIRFLOW_COMMAND} =~ ^(scheduler|worker|flower)$ ]]; then
     if [[ -n "${AIRFLOW__CELERY__BROKER_URL_CMD=}" ]]; then
@@ -123,13 +136,6 @@ if [[ ${AIRFLOW_COMMAND} =~ ^(scheduler|worker|flower)$ ]]; then
     fi
 fi
 
-if [[ ${AIRFLOW_COMMAND} == "bash" ]]; then
-   shift
-   exec "/bin/bash" "${@}"
-elif [[ ${AIRFLOW_COMMAND} == "python" ]]; then
-   shift
-   exec "python" "${@}"
-fi
 
 # Run the command
 exec airflow "${@}"
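
As a footnote to the diff above, the *_CMD precedence rule that both the docs and the script describe can be sketched as follows. This is illustrative only: "resolve_sql_conn" is a hypothetical helper name, not part of the image, and the fallback default mirrors the documented SQLite path under the default AIRFLOW_HOME of /opt/airflow.

```shell
# Illustrative sketch of the *_CMD precedence rule: if the _CMD variable is
# set, its evaluated output wins over the plain variable.
# "resolve_sql_conn" is a hypothetical helper name, not part of the image.
resolve_sql_conn() {
    if [[ -n "${AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD=}" ]]; then
        # Evaluate the command; its stdout becomes the connection string.
        eval "${AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD}"
    else
        # Fall back to the plain variable, defaulting to the SQLite path
        # (AIRFLOW_HOME is /opt/airflow by default).
        echo "${AIRFLOW__CORE__SQL_ALCHEMY_CONN:-sqlite:////opt/airflow/airflow.db}"
    fi
}
```

The point of the _CMD form is to keep the secret itself out of the container environment, e.g. AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD='cat /run/secrets/sql_conn'.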
