This is an automated email from the ASF dual-hosted git repository.
potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow-site.git
The following commit(s) were added to refs/heads/main by this push:
new 1958965 Update docker stack documentation to latest one (#452)
1958965 is described below
commit 19589659778a59c46e705f35ad9574b83659aa13
Author: Jarek Potiuk <[email protected]>
AuthorDate: Thu Jul 15 14:54:08 2021 +0200
Update docker stack documentation to latest one (#452)
That includes:
* the 2.1.2 version of Airflow
* bringing back the warning about dynamic package installation
---
.../docker-stack/_sources/entrypoint.rst.txt | 57 +++++++++--------
docs-archive/docker-stack/_static/check-solid.svg | 2 +-
docs-archive/docker-stack/_static/clipboard.min.js | 8 +--
docs-archive/docker-stack/_static/copy-button.svg | 6 +-
docs-archive/docker-stack/_static/copybutton.css | 28 ++++++---
docs-archive/docker-stack/_static/copybutton.js | 5 +-
docs-archive/docker-stack/build.html | 4 +-
docs-archive/docker-stack/entrypoint.html | 72 ++++++++++++----------
docs-archive/docker-stack/genindex.html | 36 +++++------
docs-archive/docker-stack/searchindex.js | 2 +-
10 files changed, 116 insertions(+), 104 deletions(-)
diff --git a/docs-archive/docker-stack/_sources/entrypoint.rst.txt b/docs-archive/docker-stack/_sources/entrypoint.rst.txt
index c386a67..8ac7355 100644
--- a/docs-archive/docker-stack/_sources/entrypoint.rst.txt
+++ b/docs-archive/docker-stack/_sources/entrypoint.rst.txt
@@ -94,32 +94,14 @@ You can read more about it in the "Support arbitrary user ids" chapter in the
Waits for Airflow DB connection
-------------------------------
-In case Postgres or MySQL DB is used, the entrypoint will wait until the airflow DB connection becomes
-available. This happens always when you use the default entrypoint.
+The entrypoint waits for a connection to the database regardless of the database engine used. This allows us to increase
+the stability of the environment.
-The script detects backend type depending on the URL schema and assigns default port numbers if not specified
-in the URL. Then it loops until the connection to the host/port specified can be established
+Waiting for the connection involves executing the ``airflow db check`` command, which means that a ``select 1 as is_alive;`` statement
+is executed. Then it loops until the command succeeds.
It tries :envvar:`CONNECTION_CHECK_MAX_COUNT` times and sleeps :envvar:`CONNECTION_CHECK_SLEEP_TIME` between checks
To disable check, set ``CONNECTION_CHECK_MAX_COUNT=0``.
-Supported schemes:
-
-* ``postgres://`` - default port 5432
-* ``mysql://`` - default port 3306
-* ``sqlite://``
-
-In case of SQLite backend, there is no connection to establish and waiting is skipped.
-
-For older than Airflow 1.10.14, waiting for connection involves checking if a matching port is open.
-The host information is derived from the variables :envvar:`AIRFLOW__CORE__SQL_ALCHEMY_CONN` and
-:envvar:`AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD`. If :envvar:`AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD` variable
-is passed to the container, it is evaluated as a command to execute and result of this evaluation is used
-as :envvar:`AIRFLOW__CORE__SQL_ALCHEMY_CONN`. The :envvar:`AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD` variable
-takes precedence over the :envvar:`AIRFLOW__CORE__SQL_ALCHEMY_CONN` variable.
-
-For newer versions, the ``airflow db check`` command is used, which means that a ``select 1 as is_alive;`` query
-is executed. This also means that you can keep your password in secret backend.
-
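For illustration, a minimal sketch of tuning this check when starting a container with the default entrypoint; the values below are arbitrary, and ``CONNECTION_CHECK_MAX_COUNT=0`` disables the wait entirely:

.. code-block:: bash

    # Probe the metadata DB up to 60 times, sleeping 5 seconds between attempts
    docker run -it \
        --env "CONNECTION_CHECK_MAX_COUNT=60" \
        --env "CONNECTION_CHECK_SLEEP_TIME=5" \
        apache/airflow:2.1.2-python3.6 airflow webserver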
Waits for celery broker connection
----------------------------------
@@ -155,7 +137,7 @@ if you specify extra arguments. For example:
.. code-block:: bash
- docker run -it apache/airflow:2.1.0-python3.6 bash -c "ls -la"
+ docker run -it apache/airflow:2.1.2-python3.6 bash -c "ls -la"
total 16
drwxr-xr-x 4 airflow root 4096 Jun 5 18:12 .
drwxr-xr-x 1 root root 4096 Jun 5 18:12 ..
@@ -167,7 +149,7 @@ you pass extra parameters. For example:
.. code-block:: bash
- > docker run -it apache/airflow:2.1.0-python3.6 python -c "print('test')"
+ > docker run -it apache/airflow:2.1.2-python3.6 python -c "print('test')"
test
If first argument equals to "airflow" - the rest of the arguments is treated as an airflow command
@@ -175,14 +157,14 @@ to execute. Example:
.. code-block:: bash
- docker run -it apache/airflow:2.1.0-python3.6 airflow webserver
+ docker run -it apache/airflow:2.1.2-python3.6 airflow webserver
If there are any other arguments - they are simply passed to the "airflow" command
.. code-block:: bash
- > docker run -it apache/airflow:2.1.0-python3.6 version
- 2.1.0
+ > docker run -it apache/airflow:2.1.2-python3.6 version
+ 2.1.2
Additional quick test options
-----------------------------
@@ -262,11 +244,28 @@ and Admin role. They also forward local port ``8080`` to the webserver port and
Installing additional requirements
..................................
+.. warning:: Installing requirements this way is a very convenient method of running Airflow, very useful for
+   testing and debugging. However, do not be tricked by its convenience. You should never, ever use it in a
+   production environment. We have deliberately chosen to make it a development/test dependency and we print
+   a warning whenever it is used. There is an inherent security-related issue with using this method in
+   production. Installing the requirements this way can happen at literally any time - when your containers
+   get restarted, when your machines in a K8S cluster get restarted. In a K8S cluster those events can happen
+   literally at any time. This opens you up to a serious vulnerability where your production environment
+   might be brought down by a single dependency being removed from PyPI - or even a dependency of your
+   dependency. This means that you put your production service availability in the hands of 3rd-party developers.
+   At any moment, including weekends and holidays, those 3rd-party developers might bring your
+   production Airflow instance down, without you even knowing it. This is a serious vulnerability that
+   is similar to the infamous
+   `leftpad <https://qz.com/646467/how-one-programmer-broke-the-internet-by-deleting-a-tiny-piece-of-code/>`_
+   problem. You can fully protect against this case by building your own, immutable custom image, where the
+   dependencies are baked in. You have been warned.
+
Installing additional requirements can be done by specifying ``_PIP_ADDITIONAL_REQUIREMENTS`` variable.
The variable should contain a list of requirements that should be installed additionally when entering
the containers. Note that this option slows down starting of Airflow as every time any container starts
-it must install new packages. Therefore this option should only be used for testing. When testing is
-finished, you should create your custom image with dependencies baked in.
+it must install new packages and it opens up a huge potential security vulnerability when used in production
+(see the warning above). Therefore this option should only be used for testing. When testing is finished,
+you should create your custom image with dependencies baked in.
Example:
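The example itself is truncated in this hunk; as a sketch, an invocation mirroring the rendered page further down in this commit would look like the following (the trailing ``airflow webserver`` argument is an assumption, since the rendered snippet is cut off):

.. code-block:: bash

    docker run -it -p 8080:8080 \
        --env "_PIP_ADDITIONAL_REQUIREMENTS=lxml==4.6.3 charset-normalizer==1.4.1" \
        apache/airflow:2.1.2-python3.6 airflow webserver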
diff --git a/docs-archive/docker-stack/_static/check-solid.svg b/docs-archive/docker-stack/_static/check-solid.svg
index 9cbca86..92fad4b 100644
--- a/docs-archive/docker-stack/_static/check-solid.svg
+++ b/docs-archive/docker-stack/_static/check-solid.svg
@@ -1,4 +1,4 @@
-<svg xmlns="http://www.w3.org/2000/svg" class="icon icon-tabler icon-tabler-check" width="44" height="44" viewBox="0 0 24 24" stroke-width="1.5" stroke="currentColor" fill="none" stroke-linecap="round" stroke-linejoin="round">
+<svg xmlns="http://www.w3.org/2000/svg" class="icon icon-tabler icon-tabler-check" width="44" height="44" viewBox="0 0 24 24" stroke-width="2" stroke="#22863a" fill="none" stroke-linecap="round" stroke-linejoin="round">
<path stroke="none" d="M0 0h24v24H0z" fill="none"/>
<path d="M5 12l5 5l10 -10" />
</svg>
diff --git a/docs-archive/docker-stack/_static/clipboard.min.js b/docs-archive/docker-stack/_static/clipboard.min.js
index 02c549e..54b3c46 100644
--- a/docs-archive/docker-stack/_static/clipboard.min.js
+++ b/docs-archive/docker-stack/_static/clipboard.min.js
@@ -1,7 +1,7 @@
/*!
- * clipboard.js v2.0.4
- * https://zenorocha.github.io/clipboard.js
- *
+ * clipboard.js v2.0.8
+ * https://clipboardjs.com/
+ *
* Licensed MIT © Zeno Rocha
*/
-!function(t,e){"object"==typeof exports&&"object"==typeof
module?module.exports=e():"function"==typeof
define&&define.amd?define([],e):"object"==typeof
exports?exports.ClipboardJS=e():t.ClipboardJS=e()}(this,function(){return
function(n){var o={};function r(t){if(o[t])return o[t].exports;var
e=o[t]={i:t,l:!1,exports:{}};return
n[t].call(e.exports,e,e.exports,r),e.l=!0,e.exports}return
r.m=n,r.c=o,r.d=function(t,e,n){r.o(t,e)||Object.defineProperty(t,e,{enumerable:!0,get:n})},r.r=function
[...]
\ No newline at end of file
+!function(t,e){"object"==typeof exports&&"object"==typeof
module?module.exports=e():"function"==typeof
define&&define.amd?define([],e):"object"==typeof
exports?exports.ClipboardJS=e():t.ClipboardJS=e()}(this,function(){return
n={686:function(t,e,n){"use strict";n.d(e,{default:function(){return o}});var
e=n(279),i=n.n(e),e=n(370),u=n.n(e),e=n(817),c=n.n(e);function a(t){try{return
document.execCommand(t)}catch(t){return}}var f=function(t){t=c()(t);return
a("cut"),t};var l=function(t){var [...]
\ No newline at end of file
diff --git a/docs-archive/docker-stack/_static/copy-button.svg b/docs-archive/docker-stack/_static/copy-button.svg
index 62e0e0d..b888a20 100644
--- a/docs-archive/docker-stack/_static/copy-button.svg
+++ b/docs-archive/docker-stack/_static/copy-button.svg
@@ -1,5 +1,5 @@
-<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" stroke-width="1.5" stroke="#607D8B" fill="none" stroke-linecap="round" stroke-linejoin="round">
+<svg xmlns="http://www.w3.org/2000/svg" class="icon icon-tabler icon-tabler-clipboard" width="44" height="44" viewBox="0 0 24 24" stroke-width="1.5" stroke="#2c3e50" fill="none" stroke-linecap="round" stroke-linejoin="round">
<path stroke="none" d="M0 0h24v24H0z" fill="none"/>
- <rect x="8" y="8" width="12" height="12" rx="2" />
- <path d="M16 8v-2a2 2 0 0 0 -2 -2h-8a2 2 0 0 0 -2 2v8a2 2 0 0 0 2 2h2" />
+  <path d="M9 5h-2a2 2 0 0 0 -2 2v12a2 2 0 0 0 2 2h10a2 2 0 0 0 2 -2v-12a2 2 0 0 0 -2 -2h-2" />
+ <rect x="9" y="3" width="6" height="4" rx="2" />
</svg>
diff --git a/docs-archive/docker-stack/_static/copybutton.css b/docs-archive/docker-stack/_static/copybutton.css
index 3a863dd..5d29149 100644
--- a/docs-archive/docker-stack/_static/copybutton.css
+++ b/docs-archive/docker-stack/_static/copybutton.css
@@ -1,20 +1,29 @@
/* Copy buttons */
button.copybtn {
position: absolute;
+ display: flex;
top: .3em;
right: .5em;
- width: 1.7rem;
- height: 1.7rem;
+ width: 1.7em;
+ height: 1.7em;
opacity: 0;
- transition: opacity 0.3s, border .3s;
+ transition: opacity 0.3s, border .3s, background-color .3s;
user-select: none;
padding: 0;
border: none;
outline: none;
+ border-radius: 0.4em;
+ border: #e1e1e1 1px solid;
+ background-color: rgb(245, 245, 245);
+}
+
+button.copybtn.success {
+ border-color: #22863a;
}
button.copybtn img {
width: 100%;
+ padding: .2em;
}
div.highlight {
@@ -22,11 +31,15 @@ div.highlight {
}
.highlight:hover button.copybtn {
- opacity: .7;
+ opacity: 1;
}
.highlight button.copybtn:hover {
- opacity: 1;
+ background-color: rgb(235, 235, 235);
+}
+
+.highlight button.copybtn:active {
+ background-color: rgb(187, 187, 187);
}
/**
@@ -46,11 +59,10 @@ div.highlight {
visibility: hidden;
position: absolute;
content: attr(data-tooltip);
- padding: 2px;
- top: 0;
+ padding: .2em;
+ font-size: .8em;
left: -.2em;
background: grey;
- font-size: 1rem;
color: white;
white-space: nowrap;
z-index: 2;
diff --git a/docs-archive/docker-stack/_static/copybutton.js b/docs-archive/docker-stack/_static/copybutton.js
index c4a9f92..482bda0 100644
--- a/docs-archive/docker-stack/_static/copybutton.js
+++ b/docs-archive/docker-stack/_static/copybutton.js
@@ -81,7 +81,9 @@ const clearSelection = () => {
// Changes tooltip text for two seconds, then changes it back
const temporarilyChangeTooltip = (el, oldText, newText) => {
el.setAttribute('data-tooltip', newText)
+ el.classList.add('success')
setTimeout(() => el.setAttribute('data-tooltip', oldText), 2000)
+ setTimeout(() => el.classList.remove('success'), 2000)
}
// Changes the copy button icon for two seconds, then changes it back
@@ -104,10 +106,9 @@ const addCopyButtonToCodeCells = () => {
codeCells.forEach((codeCell, index) => {
const id = codeCellId(index)
codeCell.setAttribute('id', id)
- const pre_bg = getComputedStyle(codeCell).backgroundColor;
const clipboardButton = id =>
-    `<button class="copybtn o-tooltip--left" style="background-color: ${pre_bg}" data-tooltip="${messages[locale]['copy']}" data-clipboard-target="#${id}">
+    `<button class="copybtn o-tooltip--left" data-tooltip="${messages[locale]['copy']}" data-clipboard-target="#${id}">
      <img src="${path_static}copy-button.svg" alt="${messages[locale]['copy_to_clipboard']}">
</button>`
codeCell.insertAdjacentHTML('afterend', clipboardButton(id))
diff --git a/docs-archive/docker-stack/build.html b/docs-archive/docker-stack/build.html
index bf89c80..1dff47c 100644
--- a/docs-archive/docker-stack/build.html
+++ b/docs-archive/docker-stack/build.html
@@ -1283,7 +1283,7 @@ to provide this library from you repository if you want to build Airflow image i
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>rm docker-context-files/*.whl docker-context-files/*.tar.gz docker-context-files/*.txt <span class="o">||</span> <span class="nb">true</span>
curl -Lo <span class="s2">"docker-context-files/constraints-3.7.txt"</span> <span class="se">\</span>
-    https://raw.githubusercontent.com/apache/airflow/constraints-2.0.2/constraints-3.7.txt
+    https://raw.githubusercontent.com/apache/airflow/constraints-2.1.2/constraints-3.7.txt
<span class="c1"># For Airflow pre 2.1 you need to use PIP 20.2.4 to install/download Airflow packages.</span>
pip install <span class="nv">pip</span><span class="o">==</span><span class="m">20</span>.2.4
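For readability, the HTML-escaped snippet above corresponds to roughly the following plain shell commands (only the lines visible in the hunk are reproduced):

    rm docker-context-files/*.whl docker-context-files/*.tar.gz docker-context-files/*.txt || true
    curl -Lo "docker-context-files/constraints-3.7.txt" \
        https://raw.githubusercontent.com/apache/airflow/constraints-2.1.2/constraints-3.7.txt
    # For Airflow pre 2.1 you need to use PIP 20.2.4 to install/download Airflow packages.
    pip install pip==20.2.4

The next hunk makes the matching change, bumping the AIRFLOW_VERSION build arg to 2.1.2.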
@@ -1323,7 +1323,7 @@ to the below:</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>docker build . <span class="se">\</span>
  --build-arg <span class="nv">PYTHON_BASE_IMAGE</span><span class="o">=</span><span class="s2">"python:3.7-slim-buster"</span> <span class="se">\</span>
  --build-arg <span class="nv">AIRFLOW_INSTALLATION_METHOD</span><span class="o">=</span><span class="s2">"apache-airflow"</span> <span class="se">\</span>
-  --build-arg <span class="nv">AIRFLOW_VERSION</span><span class="o">=</span><span class="s2">"2.0.2"</span> <span class="se">\</span>
+  --build-arg <span class="nv">AIRFLOW_VERSION</span><span class="o">=</span><span class="s2">"2.1.2"</span> <span class="se">\</span>
  --build-arg <span class="nv">INSTALL_MYSQL_CLIENT</span><span class="o">=</span><span class="s2">"false"</span> <span class="se">\</span>
  --build-arg <span class="nv">AIRFLOW_PRE_CACHED_PIP_PACKAGES</span><span class="o">=</span><span class="s2">"false"</span> <span class="se">\</span>
  --build-arg <span class="nv">INSTALL_FROM_DOCKER_CONTEXT_FILES</span><span class="o">=</span><span class="s2">"true"</span> <span class="se">\</span>
diff --git a/docs-archive/docker-stack/entrypoint.html b/docs-archive/docker-stack/entrypoint.html
index abe7398..1350a79 100644
--- a/docs-archive/docker-stack/entrypoint.html
+++ b/docs-archive/docker-stack/entrypoint.html
@@ -643,27 +643,12 @@ that need group access will also be writable for the group. This can be done for
</div>
<div class="section" id="waits-for-airflow-db-connection">
<h2>Waits for Airflow DB connection<a class="headerlink"
href="#waits-for-airflow-db-connection" title="Permalink to this
headline">¶</a></h2>
-<p>In case Postgres or MySQL DB is used, the entrypoint will wait until the airflow DB connection becomes
-available. This happens always when you use the default entrypoint.</p>
-<p>The script detects backend type depending on the URL schema and assigns default port numbers if not specified
-in the URL. Then it loops until the connection to the host/port specified can be established
+<p>The entrypoint waits for a connection to the database regardless of the database engine used. This allows us to increase
+the stability of the environment.</p>
+<p>Waiting for the connection involves executing the <code class="docutils literal notranslate"><span class="pre">airflow</span> <span class="pre">db</span> <span class="pre">check</span></code> command, which means that a <code class="docutils literal notranslate"><span class="pre">select</span> <span class="pre">1</span> <span class="pre">as</span> <span class="pre">is_alive;</span></code> statement
+is executed. Then it loops until the command succeeds.
It tries <span class="target" id="index-0"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">CONNECTION_CHECK_MAX_COUNT</span></code> times and sleeps <span class="target" id="index-1"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">CONNECTION_CHECK_SLEEP_TIME</span></code> between checks
To disable check, set <code class="docutils literal notranslate"><span class="pre">CONNECTION_CHECK_MAX_COUNT=0</span></code>.</p>
-<p>Supported schemes:</p>
-<ul class="simple">
-<li><p><code class="docutils literal notranslate"><span class="pre">postgres://</span></code> - default port 5432</p></li>
-<li><p><code class="docutils literal notranslate"><span class="pre">mysql://</span></code> - default port 3306</p></li>
-<li><p><code class="docutils literal notranslate"><span class="pre">sqlite://</span></code></p></li>
-</ul>
-<p>In case of SQLite backend, there is no connection to establish and waiting is skipped.</p>
-<p>For older than Airflow 1.10.14, waiting for connection involves checking if a matching port is open.
-The host information is derived from the variables <span class="target" id="index-2"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">AIRFLOW__CORE__SQL_ALCHEMY_CONN</span></code> and
-<span class="target" id="index-3"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD</span></code>. If <span class="target" id="index-4"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD</span></code> variable
-is passed to the container, it is evaluated as a command to execute and result of this evaluation is used
-as <span class="target" id="index-5"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">AIRFLOW__CORE__SQL_ALCHEMY_CONN</span></code>. The <span class="target" id="index-6"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD</span></code> variable
-takes precedence over the <span class="target" id="index-7"></span><code class="xref std std-envvar docutils literal notranslate"><span class="pre">AIRFLOW__CORE__SQL_ALCHEMY_CONN</span></code> variable.</p>
-<p>For newer versions, the <code class="docutils literal notranslate"><span class="pre">airflow</span> <span class="pre">db</span> <span class="pre">check</span></code> command is used, which means that a <code class="docutils literal notranslate"><span class="pre">select</span> <span class="pre">1</span> <span class="pre">as</span> <span class="pre">is_alive;</span></code> query
-is executed. This also means that you can keep your password in secret backend.</p>
</div>
<div class="section" id="waits-for-celery-broker-connection">
<h2>Waits for celery broker connection<a class="headerlink"
href="#waits-for-celery-broker-connection" title="Permalink to this
headline">¶</a></h2>
@@ -671,7 +656,7 @@ is executed. This also means that you can keep your password in secret backend.<
commands are used the entrypoint will wait until the celery broker DB
connection is available.</p>
<p>The script detects backend type depending on the URL schema and assigns
default port numbers if not specified
in the URL. Then it loops until connection to the host/port specified can be
established
-It tries <span class="target" id="index-8"></span><code class="xref std
std-envvar docutils literal notranslate"><span
class="pre">CONNECTION_CHECK_MAX_COUNT</span></code> times and sleeps <span
class="target" id="index-9"></span><code class="xref std std-envvar docutils
literal notranslate"><span
class="pre">CONNECTION_CHECK_SLEEP_TIME</span></code> between checks.
+It tries <span class="target" id="index-2"></span><code class="xref std
std-envvar docutils literal notranslate"><span
class="pre">CONNECTION_CHECK_MAX_COUNT</span></code> times and sleeps <span
class="target" id="index-3"></span><code class="xref std std-envvar docutils
literal notranslate"><span
class="pre">CONNECTION_CHECK_SLEEP_TIME</span></code> between checks.
To disable check, set <code class="docutils literal notranslate"><span
class="pre">CONNECTION_CHECK_MAX_COUNT=0</span></code>.</p>
<p>Supported schemes:</p>
<ul class="simple">
@@ -681,17 +666,17 @@ To disable check, set <code class="docutils literal notranslate"><span class="pr
<li><p><code class="docutils literal notranslate"><span
class="pre">mysql://</span></code> - default port 3306</p></li>
</ul>
<p>Waiting for connection involves checking if a matching port is open.
-The host information is derived from the variables <span class="target"
id="index-10"></span><code class="xref std std-envvar docutils literal
notranslate"><span class="pre">AIRFLOW__CELERY__BROKER_URL</span></code> and
-<span class="target" id="index-11"></span><code class="xref std std-envvar
docutils literal notranslate"><span
class="pre">AIRFLOW__CELERY__BROKER_URL_CMD</span></code>. If <span
class="target" id="index-12"></span><code class="xref std std-envvar docutils
literal notranslate"><span
class="pre">AIRFLOW__CELERY__BROKER_URL_CMD</span></code> variable
+The host information is derived from the variables <span class="target"
id="index-4"></span><code class="xref std std-envvar docutils literal
notranslate"><span class="pre">AIRFLOW__CELERY__BROKER_URL</span></code> and
+<span class="target" id="index-5"></span><code class="xref std std-envvar
docutils literal notranslate"><span
class="pre">AIRFLOW__CELERY__BROKER_URL_CMD</span></code>. If <span
class="target" id="index-6"></span><code class="xref std std-envvar docutils
literal notranslate"><span
class="pre">AIRFLOW__CELERY__BROKER_URL_CMD</span></code> variable
is passed to the container, it is evaluated as a command to execute and result
of this evaluation is used
-as <span class="target" id="index-13"></span><code class="xref std std-envvar
docutils literal notranslate"><span
class="pre">AIRFLOW__CELERY__BROKER_URL</span></code>. The <span class="target"
id="index-14"></span><code class="xref std std-envvar docutils literal
notranslate"><span class="pre">AIRFLOW__CELERY__BROKER_URL_CMD</span></code>
variable
-takes precedence over the <span class="target" id="index-15"></span><code
class="xref std std-envvar docutils literal notranslate"><span
class="pre">AIRFLOW__CELERY__BROKER_URL</span></code> variable.</p>
+as <span class="target" id="index-7"></span><code class="xref std std-envvar
docutils literal notranslate"><span
class="pre">AIRFLOW__CELERY__BROKER_URL</span></code>. The <span class="target"
id="index-8"></span><code class="xref std std-envvar docutils literal
notranslate"><span class="pre">AIRFLOW__CELERY__BROKER_URL_CMD</span></code>
variable
+takes precedence over the <span class="target" id="index-9"></span><code
class="xref std std-envvar docutils literal notranslate"><span
class="pre">AIRFLOW__CELERY__BROKER_URL</span></code> variable.</p>
</div>
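As an illustration of the *_CMD pattern described above, a minimal sketch of supplying the broker URL through a command; the secret file path is a placeholder, not something the image provides:

    docker run -it \
        --env "AIRFLOW__CELERY__BROKER_URL_CMD=cat /run/secrets/airflow_broker_url" \
        apache/airflow:2.1.2-python3.6 celery worker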
<div class="section" id="executing-commands">
<span id="entrypoint-commands"></span><h2>Executing commands<a
class="headerlink" href="#executing-commands" title="Permalink to this
headline">¶</a></h2>
<p>If first argument equals to “bash” - you are dropped to a bash shell or you
can executes bash command
if you specify extra arguments. For example:</p>
-<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>docker run -it
apache/airflow:2.1.0-python3.6 bash -c <span class="s2">"ls
-la"</span>
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>docker run -it
apache/airflow:2.1.2-python3.6 bash -c <span class="s2">"ls
-la"</span>
total <span class="m">16</span>
drwxr-xr-x <span class="m">4</span> airflow root <span class="m">4096</span>
Jun <span class="m">5</span> <span class="m">18</span>:12 .
drwxr-xr-x <span class="m">1</span> root root <span class="m">4096</span>
Jun <span class="m">5</span> <span class="m">18</span>:12 ..
@@ -701,18 +686,18 @@ drwxr-xr-x <span class="m">2</span> airflow root <span class="m">4096</span> Jun
</div>
<p>If first argument is equal to <code class="docutils literal
notranslate"><span class="pre">python</span></code> - you are dropped in python
shell or python commands are executed if
you pass extra parameters. For example:</p>
-<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>> docker run -it
apache/airflow:2.1.0-python3.6 python -c <span
class="s2">"print('test')"</span>
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>> docker run -it
apache/airflow:2.1.2-python3.6 python -c <span
class="s2">"print('test')"</span>
<span class="nb">test</span>
</pre></div>
</div>
<p>If first argument equals to “airflow” - the rest of the arguments is
treated as an airflow command
to execute. Example:</p>
-<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>docker run -it
apache/airflow:2.1.0-python3.6 airflow webserver
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>docker run -it
apache/airflow:2.1.2-python3.6 airflow webserver
</pre></div>
</div>
<p>If there are any other arguments - they are simply passed to the “airflow”
command</p>
-<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>> docker run -it
apache/airflow:2.1.0-python3.6 version
-<span class="m">2</span>.1.0
+<div class="highlight-bash notranslate"><div
class="highlight"><pre><span></span>> docker run -it
apache/airflow:2.1.2-python3.6 version
+<span class="m">2</span>.1.2
</pre></div>
</div>
</div>
@@ -726,7 +711,7 @@ either as maintenance operations on the database or should be embedded in the cu
(when you want to add new packages).</p>
<div class="section" id="upgrading-airflow-db">
<h3>Upgrading Airflow DB<a class="headerlink" href="#upgrading-airflow-db"
title="Permalink to this headline">¶</a></h3>
-<p>If you set <span class="target" id="index-16"></span><code class="xref std
std-envvar docutils literal notranslate"><span
class="pre">_AIRFLOW_DB_UPGRADE</span></code> variable to a non-empty value,
the entrypoint will run
+<p>If you set <span class="target" id="index-10"></span><code class="xref std
std-envvar docutils literal notranslate"><span
class="pre">_AIRFLOW_DB_UPGRADE</span></code> variable to a non-empty value,
the entrypoint will run
the <code class="docutils literal notranslate"><span
class="pre">airflow</span> <span class="pre">db</span> <span
class="pre">upgrade</span></code> command right after verifying the connection.
You can also use this
when you are running airflow with internal SQLite database (default) to
upgrade the db and create
admin users at entrypoint, so that you can start the webserver immediately.
Note - using SQLite is
@@ -736,10 +721,10 @@ comes to concurrency.</p>
<div class="section" id="creating-admin-user">
<h3>Creating admin user<a class="headerlink" href="#creating-admin-user"
title="Permalink to this headline">¶</a></h3>
<p>The entrypoint can also create webserver user automatically when you enter
it. you need to set
-<span class="target" id="index-17"></span><code class="xref std std-envvar
docutils literal notranslate"><span
class="pre">_AIRFLOW_WWW_USER_CREATE</span></code> to a non-empty value in
order to do that. This is not intended for
+<span class="target" id="index-11"></span><code class="xref std std-envvar
docutils literal notranslate"><span
class="pre">_AIRFLOW_WWW_USER_CREATE</span></code> to a non-empty value in
order to do that. This is not intended for
production, it is only useful if you would like to run a quick test with the
production image.
You need to pass at least password to create such user via <code
class="docutils literal notranslate"><span
class="pre">_AIRFLOW_WWW_USER_PASSWORD</span></code> or
-<span class="target" id="index-18"></span><code class="xref std std-envvar
docutils literal notranslate"><span
class="pre">_AIRFLOW_WWW_USER_PASSWORD_CMD</span></code> similarly like for
other <code class="docutils literal notranslate"><span
class="pre">*_CMD</span></code> variables, the content of
+<span class="target" id="index-12"></span><code class="xref std std-envvar
docutils literal notranslate"><span
class="pre">_AIRFLOW_WWW_USER_PASSWORD_CMD</span></code> similarly like for
other <code class="docutils literal notranslate"><span
class="pre">*_CMD</span></code> variables, the content of
the <code class="docutils literal notranslate"><span
class="pre">*_CMD</span></code> will be evaluated as shell command and it’s
output will be set as password.</p>
<p>User creation will fail if none of the <code class="docutils literal
notranslate"><span class="pre">PASSWORD</span></code> variables are set - there
is no default for
password for security reasons.</p>
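Combining these variables, a minimal sketch of upgrading the DB and creating an admin user at entrypoint; the password value is a placeholder and, as the page notes, this is only meant for quick local tests:

    docker run -it -p 8080:8080 \
        --env "_AIRFLOW_DB_UPGRADE=true" \
        --env "_AIRFLOW_WWW_USER_CREATE=true" \
        --env "_AIRFLOW_WWW_USER_PASSWORD=change-me" \
        apache/airflow:2.1.2-python3.6 webserver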
@@ -805,11 +790,30 @@ and Admin role. They also forward local port <code class="docutils literal notra
</div>
<div class="section" id="installing-additional-requirements">
<h3>Installing additional requirements<a class="headerlink"
href="#installing-additional-requirements" title="Permalink to this
headline">¶</a></h3>
+<div class="admonition warning">
+<p class="admonition-title">Warning</p>
+<p>Installing requirements this way is a very convenient method of running Airflow, very useful for
+testing and debugging. However, do not be tricked by its convenience. You should never, ever use it in a
+production environment. We have deliberately chosen to make it a development/test dependency and we print
+a warning whenever it is used. There is an inherent security-related issue with using this method in
+production. Installing the requirements this way can happen at literally any time - when your containers
+get restarted, when your machines in a K8S cluster get restarted. In a K8S cluster those events can happen
+literally at any time. This opens you up to a serious vulnerability where your production environment
+might be brought down by a single dependency being removed from PyPI - or even a dependency of your
+dependency. This means that you put your production service availability in the hands of 3rd-party developers.
+At any moment, including weekends and holidays, those 3rd-party developers might bring your
+production Airflow instance down, without you even knowing it. This is a serious vulnerability that
+is similar to the infamous
+<a class="reference external" href="https://qz.com/646467/how-one-programmer-broke-the-internet-by-deleting-a-tiny-piece-of-code/">leftpad</a>
+problem. You can fully protect against this case by building your own, immutable custom image, where the
+dependencies are baked in. You have been warned.</p>
+</div>
<p>Installing additional requirements can be done by specifying <code class="docutils literal notranslate"><span class="pre">_PIP_ADDITIONAL_REQUIREMENTS</span></code> variable.
The variable should contain a list of requirements that should be installed additionally when entering
the containers. Note that this option slows down starting of Airflow as every time any container starts
-it must install new packages. Therefore this option should only be used for testing. When testing is
-finished, you should create your custom image with dependencies baked in.</p>
+it must install new packages and it opens up a huge potential security vulnerability when used in production
+(see the warning above). Therefore this option should only be used for testing. When testing is finished,
+you should create your custom image with dependencies baked in.</p>
<p>Example:</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span>docker run -it -p <span class="m">8080</span>:8080 <span class="se">\</span>
  --env <span class="s2">"_PIP_ADDITIONAL_REQUIREMENTS=lxml==4.6.3 charset-normalizer==1.4.1"</span> <span class="se">\</span>
diff --git a/docs-archive/docker-stack/genindex.html b/docs-archive/docker-stack/genindex.html
index ff5cb63..c3fecc8 100644
--- a/docs-archive/docker-stack/genindex.html
+++ b/docs-archive/docker-stack/genindex.html
@@ -567,13 +567,13 @@
<h2 id="_">_</h2>
<table style="width: 100%" class="indextable genindextable"><tr>
<td style="width: 33%; vertical-align: top;"><ul>
- <li><a href="entrypoint.html#index-16">_AIRFLOW_DB_UPGRADE</a>
+ <li><a href="entrypoint.html#index-10">_AIRFLOW_DB_UPGRADE</a>
</li>
</ul></td>
<td style="width: 33%; vertical-align: top;"><ul>
- <li><a href="entrypoint.html#index-17">_AIRFLOW_WWW_USER_CREATE</a>
+ <li><a href="entrypoint.html#index-11">_AIRFLOW_WWW_USER_CREATE</a>
</li>
- <li><a href="entrypoint.html#index-18">_AIRFLOW_WWW_USER_PASSWORD_CMD</a>
+ <li><a href="entrypoint.html#index-12">_AIRFLOW_WWW_USER_PASSWORD_CMD</a>
</li>
</ul></td>
</tr></table>
@@ -581,15 +581,13 @@
<h2 id="A">A</h2>
<table style="width: 100%" class="indextable genindextable"><tr>
<td style="width: 33%; vertical-align: top;"><ul>
- <li><a href="entrypoint.html#index-10">AIRFLOW__CELERY__BROKER_URL</a>,
<a href="entrypoint.html#index-13">[1]</a>, <a
href="entrypoint.html#index-15">[2]</a>
+ <li><a href="entrypoint.html#index-4">AIRFLOW__CELERY__BROKER_URL</a>,
<a href="entrypoint.html#index-7">[1]</a>, <a
href="entrypoint.html#index-9">[2]</a>
</li>
- <li><a
href="entrypoint.html#index-11">AIRFLOW__CELERY__BROKER_URL_CMD</a>, <a
href="entrypoint.html#index-12">[1]</a>, <a
href="entrypoint.html#index-14">[2]</a>
+ <li><a
href="entrypoint.html#index-5">AIRFLOW__CELERY__BROKER_URL_CMD</a>, <a
href="entrypoint.html#index-6">[1]</a>, <a
href="entrypoint.html#index-8">[2]</a>
</li>
</ul></td>
<td style="width: 33%; vertical-align: top;"><ul>
- <li><a
href="entrypoint.html#index-2">AIRFLOW__CORE__SQL_ALCHEMY_CONN</a>, <a
href="entrypoint.html#index-5">[1]</a>, <a
href="entrypoint.html#index-7">[2]</a>, <a href="index.html#index-1">[3]</a>
-</li>
- <li><a
href="entrypoint.html#index-3">AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD</a>, <a
href="entrypoint.html#index-4">[1]</a>, <a
href="entrypoint.html#index-6">[2]</a>
+ <li><a href="index.html#index-1">AIRFLOW__CORE__SQL_ALCHEMY_CONN</a>
</li>
<li><a href="index.html#index-0">AIRFLOW_HOME</a>
</li>
@@ -599,11 +597,11 @@
<h2 id="C">C</h2>
<table style="width: 100%" class="indextable genindextable"><tr>
<td style="width: 33%; vertical-align: top;"><ul>
- <li><a href="entrypoint.html#index-0">CONNECTION_CHECK_MAX_COUNT</a>, <a
href="entrypoint.html#index-8">[1]</a>
+ <li><a href="entrypoint.html#index-0">CONNECTION_CHECK_MAX_COUNT</a>, <a
href="entrypoint.html#index-2">[1]</a>
</li>
</ul></td>
<td style="width: 33%; vertical-align: top;"><ul>
- <li><a href="entrypoint.html#index-1">CONNECTION_CHECK_SLEEP_TIME</a>,
<a href="entrypoint.html#index-9">[1]</a>
+ <li><a href="entrypoint.html#index-1">CONNECTION_CHECK_SLEEP_TIME</a>,
<a href="entrypoint.html#index-3">[1]</a>
</li>
</ul></td>
</tr></table>
@@ -615,25 +613,23 @@
environment variable
<ul>
- <li><a href="entrypoint.html#index-16">_AIRFLOW_DB_UPGRADE</a>
-</li>
- <li><a href="entrypoint.html#index-17">_AIRFLOW_WWW_USER_CREATE</a>
+ <li><a href="entrypoint.html#index-10">_AIRFLOW_DB_UPGRADE</a>
</li>
- <li><a
href="entrypoint.html#index-18">_AIRFLOW_WWW_USER_PASSWORD_CMD</a>
+ <li><a href="entrypoint.html#index-11">_AIRFLOW_WWW_USER_CREATE</a>
</li>
- <li><a
href="entrypoint.html#index-10">AIRFLOW__CELERY__BROKER_URL</a>, <a
href="entrypoint.html#index-13">[1]</a>, <a
href="entrypoint.html#index-15">[2]</a>
+ <li><a
href="entrypoint.html#index-12">_AIRFLOW_WWW_USER_PASSWORD_CMD</a>
</li>
- <li><a
href="entrypoint.html#index-11">AIRFLOW__CELERY__BROKER_URL_CMD</a>, <a
href="entrypoint.html#index-12">[1]</a>, <a
href="entrypoint.html#index-14">[2]</a>
+ <li><a href="entrypoint.html#index-4">AIRFLOW__CELERY__BROKER_URL</a>,
<a href="entrypoint.html#index-7">[1]</a>, <a
href="entrypoint.html#index-9">[2]</a>
</li>
- <li><a
href="entrypoint.html#index-2">AIRFLOW__CORE__SQL_ALCHEMY_CONN</a>, <a
href="entrypoint.html#index-5">[1]</a>, <a
href="entrypoint.html#index-7">[2]</a>, <a href="index.html#index-1">[3]</a>
+ <li><a
href="entrypoint.html#index-5">AIRFLOW__CELERY__BROKER_URL_CMD</a>, <a
href="entrypoint.html#index-6">[1]</a>, <a
href="entrypoint.html#index-8">[2]</a>
</li>
- <li><a
href="entrypoint.html#index-3">AIRFLOW__CORE__SQL_ALCHEMY_CONN_CMD</a>, <a
href="entrypoint.html#index-4">[1]</a>, <a
href="entrypoint.html#index-6">[2]</a>
+ <li><a href="index.html#index-1">AIRFLOW__CORE__SQL_ALCHEMY_CONN</a>
</li>
<li><a href="index.html#index-0">AIRFLOW_HOME</a>
</li>
- <li><a href="entrypoint.html#index-0">CONNECTION_CHECK_MAX_COUNT</a>,
<a href="entrypoint.html#index-8">[1]</a>
+ <li><a href="entrypoint.html#index-0">CONNECTION_CHECK_MAX_COUNT</a>,
<a href="entrypoint.html#index-2">[1]</a>
</li>
- <li><a href="entrypoint.html#index-1">CONNECTION_CHECK_SLEEP_TIME</a>,
<a href="entrypoint.html#index-9">[1]</a>
+ <li><a href="entrypoint.html#index-1">CONNECTION_CHECK_SLEEP_TIME</a>,
<a href="entrypoint.html#index-3">[1]</a>
</li>
</ul></li>
</ul></td>
diff --git a/docs-archive/docker-stack/searchindex.js b/docs-archive/docker-stack/searchindex.js
index 4ef34de..f80a14e 100644
--- a/docs-archive/docker-stack/searchindex.js
+++ b/docs-archive/docker-stack/searchindex.js
@@ -1 +1 @@
-Search.setIndex({docnames:["build","build-arg-ref","entrypoint","index","recipes"],envversion:{"sphinx.domains.c":2,"sphinx.domains.changeset":1,"sphinx.domains.citation":1,"sphinx.domains.cpp":3,"sphinx.domains.index":1,"sphinx.domains.javascript":2,"sphinx.domains.math":2,"sphinx.domains.python":2,"sphinx.domains.rst":2,"sphinx.domains.std":1,"sphinx.ext.intersphinx":1,"sphinx.ext.viewcode":1,sphinx:56},filenames:["build.rst","build-arg-ref.rst","entrypoint.rst","index.rst","recipes.rs
[...]
\ No newline at end of file
+Search.setIndex({docnames:["build","build-arg-ref","entrypoint","index","recipes"],envversion:{"sphinx.domains.c":2,"sphinx.domains.changeset":1,"sphinx.domains.citation":1,"sphinx.domains.cpp":3,"sphinx.domains.index":1,"sphinx.domains.javascript":2,"sphinx.domains.math":2,"sphinx.domains.python":2,"sphinx.domains.rst":2,"sphinx.domains.std":1,"sphinx.ext.intersphinx":1,"sphinx.ext.viewcode":1,sphinx:56},filenames:["build.rst","build-arg-ref.rst","entrypoint.rst","index.rst","recipes.rs
[...]
\ No newline at end of file