This is an automated email from the ASF dual-hosted git repository.
potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git
The following commit(s) were added to refs/heads/main by this push:
new 2532f727b15 docs: grammar fixes (#49432)
2532f727b15 is described below
commit 2532f727b1501297325b718095c4765d55572de8
Author: Kalyan R <[email protected]>
AuthorDate: Fri Apr 18 16:41:28 2025 +0530
docs: grammar fixes (#49432)
* proof read installation's index page
* proof read installation's index page and dependencies
* pf supported-versions.rst
* pf install from source
* pf install from pypi
* remove extra colon in upgrading to airflow3
* Capitalize the Airflow
---
PROVIDERS.rst | 4 +-
.../cluster-policies.rst | 2 +-
.../docs/administration-and-deployment/index.rst | 2 +-
.../advanced-logging-configuration.rst | 2 +-
.../modules_management.rst | 2 +-
.../docs/administration-and-deployment/plugins.rst | 4 +-
.../docs/authoring-and-scheduling/connections.rst | 2 +-
.../docs/authoring-and-scheduling/index.rst | 2 +-
airflow-core/docs/best-practices.rst | 4 +-
airflow-core/docs/configurations-ref.rst | 2 +-
airflow-core/docs/core-concepts/backfill.rst | 2 +-
airflow-core/docs/core-concepts/operators.rst | 2 +-
airflow-core/docs/core-concepts/overview.rst | 2 +-
airflow-core/docs/extra-packages-ref.rst | 14 +++---
airflow-core/docs/howto/email-config.rst | 2 +-
airflow-core/docs/howto/export-more-env-vars.rst | 2 +-
airflow-core/docs/howto/listener-plugin.rst | 6 +--
airflow-core/docs/howto/set-config.rst | 2 +-
airflow-core/docs/howto/usage-cli.rst | 2 +-
airflow-core/docs/installation/dependencies.rst | 4 +-
airflow-core/docs/installation/index.rst | 50 +++++++++++-----------
.../docs/installation/installing-from-pypi.rst | 40 ++++++++---------
.../docs/installation/installing-from-sources.rst | 12 +++---
airflow-core/docs/installation/prerequisites.rst | 11 +++--
.../docs/installation/supported-versions.rst | 10 ++---
.../docs/installation/upgrading_to_airflow3.rst | 4 +-
airflow-core/docs/templates-ref.rst | 2 +-
27 files changed, 94 insertions(+), 99 deletions(-)
diff --git a/PROVIDERS.rst b/PROVIDERS.rst
index 1d1386cceb4..ced8f5fb1b4 100644
--- a/PROVIDERS.rst
+++ b/PROVIDERS.rst
@@ -57,7 +57,7 @@ releasing new versions of the providers. This means that the code changes in the
 reviewed by Airflow committers and merged when they are accepted by them. Also we must have sufficient
 test coverage and documentation that allow us to maintain the providers, and our users to use them.
-The providers - their latest version in "main" branch of airflow repository - are installed and tested together
+The providers - their latest version in "main" branch of Airflow repository - are installed and tested together
 with other community providers and one of the key properties of the community providers is that the latest
 version of providers contribute their dependencies to constraints of Airflow, published when Airflow Core is
 released. This means that when users are using constraints published by Airflow, they can install all
@@ -92,7 +92,7 @@ Accepting new community providers
 ---------------------------------
 Accepting new community providers should be a deliberate process that requires ``[DISCUSSION]``
-followed by ``[VOTE]`` thread at the airflow `devlist <https://airflow.apache.org/community/#mailing-list>`_.
+followed by ``[VOTE]`` thread at the Airflow `devlist <https://airflow.apache.org/community/#mailing-list>`_.
 In case the provider is integration with an open-source software rather than service we can relax the vote
 procedure a bit. Particularly if the open-source software is an Apache Software Foundation,
diff --git a/airflow-core/docs/administration-and-deployment/cluster-policies.rst b/airflow-core/docs/administration-and-deployment/cluster-policies.rst
index e0b71eef7a3..e43f13e7450 100644
--- a/airflow-core/docs/administration-and-deployment/cluster-policies.rst
+++ b/airflow-core/docs/administration-and-deployment/cluster-policies.rst
@@ -152,7 +152,7 @@ Here's an example of enforcing a maximum timeout policy on every task:
     :start-after: [START example_task_cluster_policy]
     :end-before: [END example_task_cluster_policy]
-You could also implement to protect against common errors, rather than as technical security controls. For example, don't run tasks without airflow owners:
+You could also implement to protect against common errors, rather than as technical security controls. For example, don't run tasks without Airflow owners:
 .. literalinclude:: /../tests/unit/cluster_policies/__init__.py
     :language: python
diff --git a/airflow-core/docs/administration-and-deployment/index.rst b/airflow-core/docs/administration-and-deployment/index.rst
index ec39a526a77..720f5f695eb 100644
--- a/airflow-core/docs/administration-and-deployment/index.rst
+++ b/airflow-core/docs/administration-and-deployment/index.rst
@@ -18,7 +18,7 @@
 Administration and Deployment
 =====================================
-This section contains information about deploying dags into production and the administration of airflow deployments.
+This section contains information about deploying dags into production and the administration of Airflow deployments.
 .. toctree::
     :maxdepth: 2
diff --git a/airflow-core/docs/administration-and-deployment/logging-monitoring/advanced-logging-configuration.rst b/airflow-core/docs/administration-and-deployment/logging-monitoring/advanced-logging-configuration.rst
index 2cfc44e7255..342de5e016a 100644
--- a/airflow-core/docs/administration-and-deployment/logging-monitoring/advanced-logging-configuration.rst
+++ b/airflow-core/docs/administration-and-deployment/logging-monitoring/advanced-logging-configuration.rst
@@ -25,7 +25,7 @@ Not all configuration options are available from the ``airflow.cfg`` file. The c
 how to configure logging for tasks, because the logs generated by tasks are not only logged in separate
 files by default but has to be also accessible via the webserver.
-By default standard airflow component logs are written to the ``$AIRFLOW_HOME/logs`` directory, but you
+By default standard Airflow component logs are written to the ``$AIRFLOW_HOME/logs`` directory, but you
 can also customize it and configure it as you want by overriding Python logger configuration that can
 be configured by providing custom logging configuration object. You can also create and use logging configuration
 for specific operators and tasks.
diff --git a/airflow-core/docs/administration-and-deployment/modules_management.rst b/airflow-core/docs/administration-and-deployment/modules_management.rst
index 865dab06867..e6b8b7d14fc 100644
--- a/airflow-core/docs/administration-and-deployment/modules_management.rst
+++ b/airflow-core/docs/administration-and-deployment/modules_management.rst
@@ -318,7 +318,7 @@ try to import the package now:
     >>>
 We can also use :envvar:`PYTHONPATH` variable with the airflow commands.
-For example, if we run the following airflow command:
+For example, if we run the following Airflow command:
 .. code-block:: bash
diff --git a/airflow-core/docs/administration-and-deployment/plugins.rst b/airflow-core/docs/administration-and-deployment/plugins.rst
index 47c0f2fecdf..6d143c09609 100644
--- a/airflow-core/docs/administration-and-deployment/plugins.rst
+++ b/airflow-core/docs/administration-and-deployment/plugins.rst
@@ -115,7 +115,7 @@ looks like:
     # A list of dictionaries containing kwargs for FlaskAppBuilder add_link. See example below
     appbuilder_menu_items = []
-    # A callback to perform actions when airflow starts and the plugin is loaded.
+    # A callback to perform actions when Airflow starts and the plugin is loaded.
     # NOTE: Ensure your plugin has *args, and **kwargs in the method definition
     # to protect against extra parameters injected into the on_load(...)
     # function in future changes
@@ -192,7 +192,7 @@ definitions in Airflow.
        static_url_path="/static/test_plugin",
    )
-    # Creating a FastAPI application to integrate in airflow Rest API.
+    # Creating a FastAPI application to integrate in Airflow Rest API.
    app = FastAPI()
diff --git a/airflow-core/docs/authoring-and-scheduling/connections.rst b/airflow-core/docs/authoring-and-scheduling/connections.rst
index 2f24e3cc83b..ba74826519e 100644
--- a/airflow-core/docs/authoring-and-scheduling/connections.rst
+++ b/airflow-core/docs/authoring-and-scheduling/connections.rst
@@ -38,7 +38,7 @@ A Hook is a high-level interface to an external platform that lets you quickly a
 They integrate with Connections to gather credentials, and many have a default ``conn_id``; for example, the
 :class:`~airflow.providers.postgres.hooks.postgres.PostgresHook` automatically looks for the Connection with a ``conn_id`` of ``postgres_default`` if you don't pass one in.
-You can view a :ref:`full list of airflow hooks <pythonapi:hooks>` in our API documentation.
+You can view a :ref:`full list of Airflow hooks <pythonapi:hooks>` in our API documentation.
 Custom connections
 ------------------
diff --git a/airflow-core/docs/authoring-and-scheduling/index.rst b/airflow-core/docs/authoring-and-scheduling/index.rst
index fab495298c4..1b78d588bab 100644
--- a/airflow-core/docs/authoring-and-scheduling/index.rst
+++ b/airflow-core/docs/authoring-and-scheduling/index.rst
@@ -18,7 +18,7 @@
 Authoring and Scheduling
 =========================
-Here you can find detailed documentation about advanced authoring and scheduling airflow dags.
+Here you can find detailed documentation about advanced authoring and scheduling Airflow dags.
 It's recommended that you first review the pages in :doc:`core concepts </core-concepts/index>`
 .. _authoring-section:
diff --git a/airflow-core/docs/best-practices.rst b/airflow-core/docs/best-practices.rst
index 9f596a9d257..bc4889b5f20 100644
--- a/airflow-core/docs/best-practices.rst
+++ b/airflow-core/docs/best-practices.rst
@@ -451,7 +451,7 @@ for any variable that contains sensitive data.
 Timetables
 ----------
-Avoid using Airflow Variables/Connections or accessing airflow database at the top level of your timetable code.
+Avoid using Airflow Variables/Connections or accessing Airflow database at the top level of your timetable code.
 Database access should be delayed until the execution time of the DAG. This means that you should not have variables/connections retrieval
 as argument to your timetable class initialization or have Variable/connection at the top level of your custom timetable module.
@@ -980,7 +980,7 @@ The benefits of the operator are:
 * There is no need to prepare the venv upfront. It will be dynamically created before task is run, and
   removed after it is finished, so there is nothing special (except having virtualenv package in your
-  airflow dependencies) to make use of multiple virtual environments
+  Airflow dependencies) to make use of multiple virtual environments
 * You can run tasks with different sets of dependencies on the same workers - thus Memory resources are
   reused (though see below about the CPU overhead involved in creating the venvs).
 * In bigger installations, DAG Authors do not need to ask anyone to create the venvs for you.
diff --git a/airflow-core/docs/configurations-ref.rst b/airflow-core/docs/configurations-ref.rst
index 2d6b9cc3cc8..e67b8e39ce2 100644
--- a/airflow-core/docs/configurations-ref.rst
+++ b/airflow-core/docs/configurations-ref.rst
@@ -29,7 +29,7 @@ should be same on the Webserver and Worker to allow Webserver to fetch logs from
 The webserver key is also used to authorize requests to Celery workers when logs are retrieved. The token
 generated using the secret key has a short expiry time though - make sure that time on ALL the machines
-that you run airflow components on is synchronized (for example using ntpd) otherwise you might get
+that you run Airflow components on is synchronized (for example using ntpd) otherwise you might get
 "forbidden" errors when the logs are accessed.
 .. note::
diff --git a/airflow-core/docs/core-concepts/backfill.rst b/airflow-core/docs/core-concepts/backfill.rst
index ac6729b23d0..955a3661557 100644
--- a/airflow-core/docs/core-concepts/backfill.rst
+++ b/airflow-core/docs/core-concepts/backfill.rst
@@ -20,7 +20,7 @@ Backfill
 Backfill is when you create runs for past dates of a dag. Airflow provides a mechanism
 to do this through the CLI and REST API. You provide a dag, a start date, and an end date,
-and airflow will create runs in the range according to the dag's schedule.
+and Airflow will create runs in the range according to the dag's schedule.
 Backfill does not make sense for dags that don't have a time-based schedule.
diff --git a/airflow-core/docs/core-concepts/operators.rst b/airflow-core/docs/core-concepts/operators.rst
index 0d66ba3f61d..596b10ae169 100644
--- a/airflow-core/docs/core-concepts/operators.rst
+++ b/airflow-core/docs/core-concepts/operators.rst
@@ -199,7 +199,7 @@ In some cases, you may want to exclude a string from templating and use it direc
    )
 This will fail with ``TemplateNotFound: cat script.sh`` since Airflow would treat the string as a path to a file, not a command.
-We can prevent airflow from treating this value as a reference to a file by wrapping it in :func:`~airflow.util.template.literal`.
+We can prevent Airflow from treating this value as a reference to a file by wrapping it in :func:`~airflow.util.template.literal`.
 This approach disables the rendering of both macros and files and can be applied to selected nested fields while retaining the default templating rules for the remainder of the content.
 .. code-block:: python
diff --git a/airflow-core/docs/core-concepts/overview.rst b/airflow-core/docs/core-concepts/overview.rst
index 027d14da12e..a0111b59183 100644
--- a/airflow-core/docs/core-concepts/overview.rst
+++ b/airflow-core/docs/core-concepts/overview.rst
@@ -59,7 +59,7 @@ A minimal Airflow installation consists of the following components:
 * A folder of *DAG files*, which is read by the *scheduler* to figure out what tasks to run and when to
   run them.
-* A *metadata database*, which airflow components use to store state of workflows and tasks.
+* A *metadata database*, which Airflow components use to store state of workflows and tasks.
   Setting up a metadata database is described in :doc:`/howto/set-up-database` and is required for
   Airflow to work.
diff --git a/airflow-core/docs/extra-packages-ref.rst b/airflow-core/docs/extra-packages-ref.rst
index 5ff1fd6d0d8..9c360209ac2 100644
--- a/airflow-core/docs/extra-packages-ref.rst
+++ b/airflow-core/docs/extra-packages-ref.rst
@@ -26,18 +26,18 @@ already existing ``providers`` and the dependencies are isolated and simplified
 packages.
 While the original installation methods via ``apache-airflow`` distribution package and extras still
-work as previously and it installs complete airflow installation ready to serve as scheduler, webserver, triggerer
+work as previously and it installs complete Airflow installation ready to serve as scheduler, webserver, triggerer
 and worker, the ``apache-airflow`` package is now a meta-package that installs all the other distribution
 packages, it's also possible to install only the distribution packages that are needed for a specific
-component you want to run airflow with.
+component you want to run Airflow with.
 The following distribution packages are available:
 +----------------------------+------------------------------------------------------------------+----------------------------------------------------------+
 | Distribution package       | Purpose                                                          | Optional extras                                          |
 +----------------------------+------------------------------------------------------------------+----------------------------------------------------------+
-| apache-airflow-core        | This is the core distribution package that contains              | * Core extras that add optional functionality to airflow |
-|                            | the airflow scheduler, webserver, triggerer code.                |   core system - enhancing its functionality across      |
+| apache-airflow-core        | This is the core distribution package that contains              | * Core extras that add optional functionality to Airflow |
+|                            | the Airflow scheduler, webserver, triggerer code.                |   core system - enhancing its functionality across      |
 |                            |                                                                  |   multiple providers.                                    |
 |                            |                                                                  |                                                          |
 |                            |                                                                  | * Group ``all`` extra that installs all optional         |
@@ -71,7 +71,7 @@ The following distribution packages are available:
 As mentioned above, Airflow has a number of optional "extras" that you can use to add features to your
 installation when you are installing Airflow. Those extras are a good way for the users to manage their
-installation, but also they are useful for contributors to airflow when they want to contribute some of
+installation, but also they are useful for contributors to Airflow when they want to contribute some of
 the features - including optional integrations of Airflow - via providers.
 Here's the list of all the extra dependencies of Apache Airflow.
@@ -79,7 +79,7 @@ Here's the list of all the extra dependencies of Apache Airflow.
 Core Airflow extras
 -------------------
-These are core airflow extras that extend capabilities of core Airflow. They usually do not install provider
+These are core Airflow extras that extend capabilities of core Airflow. They usually do not install provider
 packages (with the exception of ``celery`` and ``cncf.kubernetes`` extras), they just install necessary
 python dependencies for the provided package.
@@ -129,7 +129,7 @@ Providers extras
 These providers extras are simply convenience extras to install providers so that you can install the providers with simple command - including
 provider package and necessary dependencies in single command, which allows PIP to resolve any conflicting dependencies. This is extremely useful
-for first time installation where you want to repeatably install version of dependencies which are 'valid' for both airflow and providers installed.
+for first time installation where you want to repeatably install version of dependencies which are 'valid' for both Airflow and providers installed.
 For example the below command will install:
diff --git a/airflow-core/docs/howto/email-config.rst b/airflow-core/docs/howto/email-config.rst
index 82a3e745c1b..532f987aca7 100644
--- a/airflow-core/docs/howto/email-config.rst
+++ b/airflow-core/docs/howto/email-config.rst
@@ -88,7 +88,7 @@ Send email using SendGrid
 Using Default SMTP
 ^^^^^^^^^^^^^^^^^^
-You can use the default airflow SMTP backend to send email with SendGrid
+You can use the default Airflow SMTP backend to send email with SendGrid
 .. code-block:: ini
diff --git a/airflow-core/docs/howto/export-more-env-vars.rst b/airflow-core/docs/howto/export-more-env-vars.rst
index e393f479302..a35b3be6fb5 100644
--- a/airflow-core/docs/howto/export-more-env-vars.rst
+++ b/airflow-core/docs/howto/export-more-env-vars.rst
@@ -23,7 +23,7 @@ Export dynamic environment variables available for operators to use
 The key value pairs returned in ``get_airflow_context_vars`` defined in
-``airflow_local_settings.py`` are injected to default airflow context environment variables,
+``airflow_local_settings.py`` are injected to default Airflow context environment variables,
 which are available as environment variables when running tasks. Note, both key and
 value are must be string.
diff --git a/airflow-core/docs/howto/listener-plugin.rst b/airflow-core/docs/howto/listener-plugin.rst
index 20569a3fc6f..9d139093548 100644
--- a/airflow-core/docs/howto/listener-plugin.rst
+++ b/airflow-core/docs/howto/listener-plugin.rst
@@ -44,14 +44,14 @@ Using this plugin, following events can be listened:
  * dag run is in running state.
  * dag run is in success state.
  * dag run is in failure state.
- * on start before event like airflow job, scheduler
- * before stop for event like airflow job, scheduler
+ * on start before event like Airflow job, scheduler
+ * before stop for event like Airflow job, scheduler
 Listener Registration
 ---------------------
 A listener plugin with object reference to listener object is registered
-as part of airflow plugin. The following is a
+as part of Airflow plugin. The following is a
 skeleton for us to implement a new listener:
 .. code-block:: python
diff --git a/airflow-core/docs/howto/set-config.rst b/airflow-core/docs/howto/set-config.rst
index b25bf3470a8..2ff69415964 100644
--- a/airflow-core/docs/howto/set-config.rst
+++ b/airflow-core/docs/howto/set-config.rst
@@ -164,7 +164,7 @@ the example below.
     The webserver key is also used to authorize requests to Celery workers when logs are retrieved. The token
     generated using the secret key has a short expiry time though - make sure that time on ALL the machines
-    that you run airflow components on is synchronized (for example using ntpd) otherwise you might get
+    that you run Airflow components on is synchronized (for example using ntpd) otherwise you might get
     "forbidden" errors when the logs are accessed.
 .. _set-config:configuring-local-settings:
diff --git a/airflow-core/docs/howto/usage-cli.rst b/airflow-core/docs/howto/usage-cli.rst
index a3bb13e0759..82005804b10 100644
--- a/airflow-core/docs/howto/usage-cli.rst
+++ b/airflow-core/docs/howto/usage-cli.rst
@@ -44,7 +44,7 @@ For permanent (but not global) airflow activation, use:
     register-python-argcomplete airflow >> ~/.bashrc
-For one-time activation of argcomplete for airflow only, use:
+For one-time activation of argcomplete for Airflow only, use:
 .. code-block:: bash
diff --git a/airflow-core/docs/installation/dependencies.rst b/airflow-core/docs/installation/dependencies.rst
index 515b2a9d3c8..e41e03ab46a 100644
--- a/airflow-core/docs/installation/dependencies.rst
+++ b/airflow-core/docs/installation/dependencies.rst
@@ -23,7 +23,7 @@ Airflow extra dependencies
 The ``apache-airflow`` PyPI basic package only installs what's needed to get started.
 Additional packages can be installed depending on what will be useful in your
-environment. For instance, if you don't need connectivity with Postgres,
+environment. For instance, if you don't need connectivity with PostgreSQL,
 you won't have to go through the trouble of installing the ``postgres-devel``
 yum package, or whatever equivalent applies on the distribution you are using.
@@ -55,7 +55,7 @@ Just to prevent confusion of extras versus providers: Extras and providers are d
 though many extras are leading to installing providers.
 Extras are standard Python setuptools feature that allows to add additional set of dependencies as
-optional features to "core" Apache Airflow. One of the type of such optional features are providers
+optional features to "core" Apache Airflow. One type of such optional features is providers
 packages, but not all optional features of Apache Airflow have corresponding providers.
 We are using the ``extras`` setuptools features to also install providers.
diff --git a/airflow-core/docs/installation/index.rst
b/airflow-core/docs/installation/index.rst
index 492dd76c68c..a72f44fb40f 100644
--- a/airflow-core/docs/installation/index.rst
+++ b/airflow-core/docs/installation/index.rst
@@ -35,12 +35,12 @@ Installation of Airflow®
Upgrading <upgrading>
Upgrading to Airflow 3 <upgrading_to_airflow3>
-This page describes installations options that you might use when considering
how to install Airflow®.
+This page describes installation options that you might use when considering
how to install Airflow®.
Airflow consists of many components, often distributed among many physical or
virtual machines, therefore
installation of Airflow might be quite complex, depending on the options you
choose.
-You should also check-out the :doc:`Prerequisites <prerequisites>` that must
be fulfilled when installing Airflow
-as well as :doc:`Supported versions <supported-versions>` to know what are
the policies for supporting
+You should also check out the :doc:`Prerequisites <prerequisites>` that must
be fulfilled when installing Airflow
+as well as :doc:`Supported versions <supported-versions>` to know what are
the policies for the supporting
Airflow, Python and Kubernetes.
Airflow requires additional :doc:`Dependencies <dependencies>` to be installed
- which can be done
@@ -68,9 +68,9 @@ More details: :doc:`installing-from-sources`
**What are you expected to handle**
-* You are expected to build and install airflow and its components on your own.
+* You are expected to build and install Airflow and its components on your own.
* You should develop and handle the deployment for all components of Airflow.
-* You are responsible for setting up database, creating and managing database
schema with ``airflow db`` commands,
+* You are responsible for setting up the database, creating and managing
database schema with ``airflow db`` commands,
automated startup and recovery, maintenance, cleanup and upgrades of Airflow
and the Airflow Providers.
* You need to setup monitoring of your system allowing you to observe
resources and react to problems.
* You are expected to configure and manage appropriate resources for the
installation (memory, CPU, etc) based
@@ -84,14 +84,14 @@ More details: :doc:`installing-from-sources`
**Where to ask for help**
-* The ``#user-troubleshooting`` channel on slack can be used for quick general
troubleshooting questions. The
+* The ``#user-troubleshooting`` channel on Slack can be used for quick general
troubleshooting questions. The
`GitHub discussions <https://github.com/apache/airflow/discussions>`__ if
you look for longer discussion and have more information to share.
-* The ``#user-best-practices`` channel on slack can be used to ask for and
share best practices on using and deploying airflow.
+* The ``#user-best-practices`` channel on Slack can be used to ask for and
share best practices on using and deploying Airflow.
* If you can provide description of a reproducible problem with Airflow
software, you can open issue at `GitHub issues
<https://github.com/apache/airflow/issues>`_
-* If you want to contribute back to Airflow, the ``#contributors`` slack
channel for building the Airflow itself
+* If you want to contribute back to Airflow, the ``#contributors`` Slack
channel for building the Airflow itself
Using PyPI
@@ -103,7 +103,7 @@ More details: :doc:`/installation/installing-from-pypi`
* This installation method is useful when you are not familiar with Containers
and Docker and want to install
Apache Airflow on physical or virtual machines and you are used to
installing and running software using custom
- deployment mechanism.
+ deployment mechanisms.
* The only officially supported mechanism of installation is via ``pip`` using
constraint mechanisms. The constraint
files are managed by Apache Airflow release managers to make sure that you
can repeatably install Airflow from PyPI with all Providers and
@@ -122,7 +122,7 @@ More details: :doc:`/installation/installing-from-pypi`
* You are expected to install Airflow - all components of it - on your own.
* You should develop and handle the deployment for all components of Airflow.
-* You are responsible for setting up database, creating and managing database
schema with ``airflow db`` commands,
+* You are responsible for setting up the database, creating and managing
database schema with ``airflow db`` commands,
automated startup and recovery, maintenance, cleanup and upgrades of Airflow
and Airflow Providers.
* You need to setup monitoring of your system allowing you to observe
resources and react to problems.
* You are expected to configure and manage appropriate resources for the
installation (memory, CPU, etc) based
@@ -144,8 +144,8 @@ More details: :doc:`/installation/installing-from-pypi`
* The ``#user-troubleshooting`` channel on Airflow Slack for quick general
troubleshooting questions. The `GitHub discussions
<https://github.com/apache/airflow/discussions>`__
if you look for longer discussion and have more information to share.
-* The ``#user-best-practices`` channel on slack can be used to ask for and
share best
- practices on using and deploying airflow.
+* The ``#user-best-practices`` channel on Slack can be used to ask for and
share best
+ practices on using and deploying Airflow.
* If you can provide description of a reproducible problem with Airflow
software, you can open
issue at `GitHub issues <https://github.com/apache/airflow/issues>`__
@@ -162,7 +162,7 @@ running Airflow components in isolation from other software
running on the same
maintenance of dependencies.
The images are built by Apache Airflow release managers and they use
officially released packages from PyPI
-and official constraint files- same that are used for installing Airflow from
PyPI.
+and official constraint files - same that are used for installing Airflow from
PyPI.
**Intended users**
@@ -174,16 +174,16 @@ and official constraint files- same that are used for
installing Airflow from Py
* You are expected to be able to customize or extend Container/Docker images
if you want to
add extra dependencies. You are expected to put together a deployment built
of several containers
- (for example using docker-compose) and to make sure that they are linked
together.
-* You are responsible for setting up database, creating and managing database
schema with ``airflow db`` commands,
+ (for example using ``docker-compose``) and to make sure that they are linked
together.
+* You are responsible for setting up the database, creating and managing
database schema with ``airflow db`` commands,
automated startup and recovery, maintenance, cleanup and upgrades of Airflow
and the Airflow Providers.
* You are responsible to manage your own customizations and extensions for
your custom dependencies.
With the Official Airflow Docker Images, upgrades of Airflow and Airflow
Providers which
are part of the reference image are handled by the community - you need to
make sure to pick up
- those changes when released by upgrading the base image. However, you are
responsible in creating a
+ those changes when released by upgrading the base image. However, you are
responsible for creating a
pipeline of building your own custom images with your own added dependencies
and Providers and need to
repeat the customization step and building your own image when new version
of Airflow image is released.
-* You should choose the right deployment mechanism. There a number of
available options of
+* You should choose the right deployment mechanism. There are a number of
available options of
deployments of containers. You can use your own custom mechanism, custom
Kubernetes deployments,
custom Docker Compose, custom Helm charts etc., and you should choose it
based on your experience
and expectations.
@@ -208,8 +208,8 @@ and official constraint files- same that are used for
installing Airflow from Py
* The ``#user-troubleshooting`` channel on Airflow Slack for quick general
troubleshooting questions. The `GitHub discussions
<https://github.com/apache/airflow/discussions>`__
if you look for longer discussion and have more information to share.
-* The ``#user-best-practices`` channel on slack can be used to ask for and
share best
- practices on using and deploying airflow.
+* The ``#user-best-practices`` channel on Slack can be used to ask for and
share best
+ practices on using and deploying Airflow.
* If you can provide description of a reproducible problem with Airflow
software, you can open
issue at `GitHub issues <https://github.com/apache/airflow/issues>`__
@@ -246,7 +246,7 @@ More details: :doc:`helm-chart:index`
* You are responsible to manage your own customizations and extensions for
your custom dependencies.
With the Official Airflow Docker Images, upgrades of Airflow and Airflow
Providers which
are part of the reference image are handled by the community - you need to
make sure to pick up
- those changes when released by upgrading the base image. However, you are
responsible in creating a
+ those changes when released by upgrading the base image. However, you are
responsible for creating a
pipeline of building your own custom images with your own added dependencies
and Providers and need to
repeat the customization step and building your own image when new version
of Airflow image is released.
* You need to setup monitoring of your system allowing you to observe
resources and react to problems.
@@ -267,8 +267,8 @@ More details: :doc:`helm-chart:index`
* The ``#user-troubleshooting`` channel on Airflow Slack for quick general
troubleshooting questions. The `GitHub discussions
<https://github.com/apache/airflow/discussions>`__
if you look for longer discussion and have more information to share.
-* The ``#user-best-practices`` channel on slack can be used to ask for and
share best
- practices on using and deploying airflow.
+* The ``#user-best-practices`` channel on Slack can be used to ask for and
share best
+ practices on using and deploying Airflow.
* If you can provide description of a reproducible problem with Airflow
software, you can open
issue at `GitHub issues <https://github.com/apache/airflow/issues>`__
@@ -372,15 +372,15 @@ control theory - where there are two types of systems:
2. Complex systems with multiple variables, that are hard to predict and where
you need to monitor
the system and adjust the knobs continuously to make sure the system is
running smoothly.
-Airflow (and generally any modern system running usually on cloud services,
with multiple layers responsible
-for resources as well multiple parameters to control their behaviour) is a
complex system and they fall
+Airflow (and generally any modern system, usually running on cloud services, with multiple layers responsible
+for resources as well as multiple parameters to control their behaviour) is a complex system and it falls
much more in the second category. If you decide to run Airflow in production
on your own, you should be
prepared for the monitor/observe/adjust feedback loop to make sure the system
is running smoothly.
Having a good monitoring system that will allow you to monitor the system and
adjust the parameters
is a must to put that in practice.
-There are few guidelines that you can use for optimizing your resource usage
as well. The
+There are a few guidelines that you can use for optimizing your resource usage
as well. The
:ref:`fine-tuning-scheduler` is a good starting point to fine-tune your
scheduler, you can also follow
the :ref:`best_practice` guide to make sure you are using Airflow in the most
efficient way.
diff --git a/airflow-core/docs/installation/installing-from-pypi.rst
b/airflow-core/docs/installation/installing-from-pypi.rst
index 542c7463fde..1a70723842a 100644
--- a/airflow-core/docs/installation/installing-from-pypi.rst
+++ b/airflow-core/docs/installation/installing-from-pypi.rst
@@ -31,16 +31,12 @@ Only ``pip`` installation is currently officially supported.
While there are some successes with using other tools like `poetry
<https://python-poetry.org/>`_ or
`pip-tools <https://pypi.org/project/pip-tools/>`_, they do not share the
same workflow as
``pip`` - especially when it comes to constraint vs. requirements management.
- Installing via ``Poetry`` or ``pip-tools`` is not currently supported. If
you wish to install airflow
+ Installing via ``Poetry`` or ``pip-tools`` is not currently supported. If
you wish to install Airflow
using those tools you should use the constraints and convert them to
appropriate
format and workflow that your tool requires.
- There are known issues with ``bazel`` that might lead to circular
dependencies when using it to install
- Airflow. Please switch to ``pip`` if you encounter such problems. ``Bazel``
community works on fixing
- the problem in `this PR
<https://github.com/bazelbuild/rules_python/pull/1166>`_ so it might be that
- newer versions of ``bazel`` will handle it.
-Typical command to install airflow from scratch in a reproducible way from
PyPI looks like below:
+A typical command to install Airflow from scratch in a reproducible way from PyPI looks like the one below:
.. code-block:: bash
@@ -63,12 +59,12 @@ Those are just examples, see further for more explanation
why those are the best
.. note::
Generally speaking, Python community established practice is to perform
application installation in a
- virtualenv created with ``virtualenv`` or ``venv`` tools. You can also use
``uv`` or ``pipx`` to install
+ virtual environment created with ``virtualenv`` or ``venv`` tools. You can
also use ``uv`` or ``pipx`` to install
Airflow in application dedicated virtual environment created for you. There
are also other tools that can be used
- to manage your virtualenv installation and you are free to choose how you
are managing the environments.
- Airflow has no limitation regarding to the tool of your choice when it
comes to virtual environment.
+ to manage your virtual environment installation and you are free to choose
how you are managing the environments.
+ Airflow has no limitation regarding the tool of your choice when it comes
to virtual environment.
- The only exception where you might consider not using virtualenv is when
you are building a container
+ The only exception where you might consider not using a virtual environment is when you are building a container
image with only Airflow installed - this is for example how Airflow is
installed in the official Container
image.
@@ -84,7 +80,7 @@ Airflow® installation can be tricky because Airflow is both a
library and an ap
Libraries usually keep their dependencies open and applications usually pin
them, but we should do neither
and both at the same time. We decided to keep our dependencies as open as
possible
-(in ``pyproject.toml``) so users can install different version of libraries if
needed. This means that
+(in ``pyproject.toml``) so users can install different versions of libraries
if needed. This means that
from time to time plain ``pip install apache-airflow`` will not work or will
produce an unusable
Airflow installation.
@@ -96,7 +92,7 @@ In order to have a reproducible installation, we also keep a
set of constraint f
for each released version e.g. :subst-code:`constraints-|version|`.
This way, we keep a tested set of dependencies at the moment of release. This
provides you with the ability
-of having the exact same installation of airflow + providers + dependencies as
was known to be working
+of having the exact same installation of Airflow + providers + dependencies as
was known to be working
at the moment of release - frozen set of dependencies for that version of
Airflow. There is a separate
constraints file for each version of Python that Airflow supports.
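[Editor's note on the hunk above] The per-version constraints URL scheme described here can be sketched in shell; the Airflow version shown is illustrative, substitute the version you actually install:

```shell
# Build the constraints URL described above: one file per Airflow version
# and per Python minor version. AIRFLOW_VERSION here is an example value.
AIRFLOW_VERSION=2.10.2
PYTHON_VERSION="$(python3 -c 'import sys; print(f"{sys.version_info.major}.{sys.version_info.minor}")')"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
echo "${CONSTRAINT_URL}"
```

The resulting URL is then passed to ``pip install --constraint`` as shown in the surrounding examples.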
@@ -111,7 +107,7 @@ where:
- ``AIRFLOW_VERSION`` - Airflow version (e.g. :subst-code:`|version|`) or
``main``, ``2-0``, for latest development version
- ``PYTHON_VERSION`` Python version e.g. ``3.9``, ``3.10``
-The examples below assume that you want to use install airflow in a
reproducible way with the ``celery`` extra,
+The examples below assume that you want to install Airflow in a reproducible way with the ``celery`` extra,
but you can pick your own set of extras and providers to install.
.. code-block:: bash
@@ -135,7 +131,7 @@ Upgrading and installing dependencies (including providers)
providers and other dependencies to other versions**
You can, for example, install new versions of providers and dependencies after
the release to use the latest
-version and up-to-date with latest security fixes - even if you do not want
upgrade airflow core version.
+version and stay up-to-date with the latest security fixes - even if you do not want to upgrade the Airflow core version.
Or you can downgrade some dependencies or providers if you want to keep
previous versions for compatibility
reasons. Installing such dependencies should be done without constraints as a
separate pip command.
@@ -184,7 +180,7 @@ consistent and not conflicting.
No broken requirements found.
-When you see such message and the exit code from ``pip check`` is 0, you can
be sure, that there are no
+When you see such a message and the exit code from ``pip check`` is 0, you can be sure that there are no
conflicting dependencies in your environment.
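[Editor's note on the hunk above] A minimal sketch of acting on the ``pip check`` exit code described here, invoked via ``python3 -m pip`` so it works regardless of how ``pip`` is on the ``PATH``:

```shell
# `pip check` exits 0 when no broken requirements are found,
# matching the "No broken requirements found." message above.
if python3 -m pip check > /dev/null 2>&1; then
  result="consistent"
else
  result="conflicts"
fi
echo "${result}"
```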
@@ -211,7 +207,7 @@ a local constraints file:
pip install "apache-airflow[celery]==|version|" --constraint
"my-constraints.txt"
-Similarly as in case of Airflow original constraints, you can also host your
constraints at your own
+Similarly to the original Airflow constraints, you can also host your constraints at your own
repository or server and use it remotely from there.
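[Editor's note on the hunk above] One simple way to produce such a local constraints file - an assumption, since the docs do not prescribe how ``my-constraints.txt`` is created - is to freeze the currently working environment:

```shell
# Capture the exact versions of the current environment into a local
# constraints file, usable later with `pip install --constraint`.
python3 -m pip freeze > my-constraints.txt
wc -l my-constraints.txt
```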
Fixing Constraints at release time
@@ -219,7 +215,7 @@ Fixing Constraints at release time
The released "versioned" constraints are mostly ``fixed`` when we release
Airflow version and we only
update them in exceptional circumstances. For example when we find out that
the released constraints might prevent
-Airflow from being installed consistently from the scratch.
+Airflow from being installed consistently from scratch.
In normal circumstances, the constraint files are not going to change if new
version of Airflow
dependencies are released - not even when those versions contain critical
security fixes.
@@ -260,7 +256,7 @@ providers in case they were released afterwards.
Upgrading Airflow together with providers
=========================================
-You can upgrade airflow together with extras (providers available at the time
of the release of Airflow
+You can upgrade Airflow together with extras - providers available at the time of the release of the Airflow version
being installed. This will bring ``apache-airflow`` and all providers to the
versions that were
released and tested together when the version of Airflow you are installing
was released.
@@ -291,7 +287,7 @@ Constraints are only effective during the ``pip install``
command they were used
It is the best practice to install apache-airflow in the same version as the
one that comes from the
original image. This way you can be sure that ``pip`` will not try to
downgrade or upgrade apache
-airflow while installing other requirements, which might happen in case you
try to add a dependency
+Airflow while installing other requirements, which might happen in case you
try to add a dependency
that conflicts with the version of apache-airflow that you are using:
.. code-block:: bash
@@ -314,11 +310,11 @@ Managing just Airflow core without providers
============================================
If you don't want to install any providers you have, just install or upgrade
Apache Airflow, you can simply
-install airflow in the version you need. You can use the special
``constraints-no-providers`` constraints
+install Airflow in the version you need. You can use the special
``constraints-no-providers`` constraints
file, which is smaller and limits the dependencies to the core of Airflow
only, however this can lead to
conflicts if your environment already has some of the dependencies installed
in different versions and
in case you have other providers installed. This command, however, gives you
the latest versions of
-dependencies compatible with just airflow core at the moment Airflow was
released.
+dependencies compatible with just Airflow core at the moment Airflow was
released.
.. code-block:: bash
:substitutions:
@@ -345,7 +341,7 @@ ensure that ``~/.local/bin`` is in your ``PATH``
environment variable, and add i
PATH=$PATH:~/.local/bin
-You can also start airflow with ``python -m airflow``
+You can also start Airflow with ``python -m airflow``.
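[Editor's note on the hunk above] The ``PATH`` adjustment this section describes can be sketched as below; ``~/.local/bin`` is where ``pip install --user`` places the ``airflow`` entrypoint on typical Linux setups:

```shell
# Append the user-level scripts directory to PATH for the current shell,
# so the `airflow` command installed with `pip install --user` is found.
export PATH="${PATH}:${HOME}/.local/bin"
echo "${PATH}"
```

To make the change permanent, the same ``export`` line would go into your shell profile (e.g. ``~/.bashrc``).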
Symbol not found: ``_Py_GetArgcArgv``
=====================================
diff --git a/airflow-core/docs/installation/installing-from-sources.rst
b/airflow-core/docs/installation/installing-from-sources.rst
index 949d949df2a..bafd3df9b33 100644
--- a/airflow-core/docs/installation/installing-from-sources.rst
+++ b/airflow-core/docs/installation/installing-from-sources.rst
@@ -26,18 +26,18 @@ Released packages
This page describes downloading and verifying Airflow® version
``{{ airflow_version }}`` using officially released packages.
You can also install ``Apache Airflow`` - as most Python packages - via
:doc:`PyPI <installing-from-pypi>`.
- You can choose different version of Airflow by selecting different version
from the drop-down at
+ You can choose a different version of Airflow by selecting a different version from the drop-down at
the top-left of the page.
The ``source``, ``sdist`` and ``whl`` packages released are the "official"
sources of installation that you
can use if you want to verify the origin of the packages and want to verify
checksums and signatures of
the packages. The packages are available via the
-`Official Apache Software Foundations Downloads <https://dlcdn.apache.org/>`_
+`Official Apache Software Foundation Downloads <https://dlcdn.apache.org/>`_.
As of version 2.8 Airflow follows PEP 517/518 and uses ``pyproject.toml`` file
to define build dependencies
-and build process and it requires relatively modern versions of packaging
tools to get airflow built from
+and build process and it requires relatively modern versions of packaging
tools to get Airflow built from
local sources or ``sdist`` packages, as PEP 517 compliant build hooks are used
to determine dynamic build
-dependencies. In case of ``pip`` it means that at least version 22.1.0 is
needed (released at the beginning of
+dependencies. In case of ``pip``, it means that at least version 22.1.0 is
needed (released at the beginning of
2022) to build or install Airflow from sources. This does not affect the
ability of installing Airflow from
released wheel packages.
@@ -116,7 +116,7 @@ Example:
The "Good signature from ..." is indication that the signatures are correct.
Do not worry about the "not certified with a trusted signature" warning. Most
of the certificates used
-by release managers are self signed, that's why you get this warning. By
importing the server in the
+by release managers are self-signed, that's why you get this warning. By importing the key from the server in the
previous step and importing it via ID from ``KEYS`` page, you know that this
is a valid Key already.
For SHA512 sum check, download the relevant ``sha512`` and run the following:
@@ -159,5 +159,5 @@ and SHA sum files with the script below:
ls -la "${airflow_download_dir}"
echo
-Once you verify the files following the instructions from previous chapter you
can remove the temporary
+Once you verify the files following the instructions from the previous section, you can remove the temporary
folder created.
diff --git a/airflow-core/docs/installation/prerequisites.rst
b/airflow-core/docs/installation/prerequisites.rst
index 3342b63bf15..323af74f8e5 100644
--- a/airflow-core/docs/installation/prerequisites.rst
+++ b/airflow-core/docs/installation/prerequisites.rst
@@ -30,8 +30,7 @@ Airflow® is tested with:
* Kubernetes: 1.26, 1.27, 1.28, 1.29, 1.30
-The minimum memory required we recommend Airflow to run with is 4GB, but the
actual requirements depend
-wildly on the deployment options you have
+While we recommend a minimum of 4GB of memory for Airflow, the actual
requirements heavily depend on your chosen deployment.
.. warning::
@@ -43,17 +42,17 @@ wildly on the deployment options you have
because the number of users who tried to use MariaDB for Airflow is very
small.
.. warning::
- SQLite is used in Airflow tests. Do not use it in production. We recommend
+ SQLite is used in Airflow tests. DO NOT use it in production. We recommend
using the latest stable version of SQLite for local development.
.. warning::
Airflow® currently can be run on POSIX-compliant Operating Systems. For
development it is regularly
- tested on fairly modern Linux Distros that our contributors use and recent
versions of MacOS.
+ tested on fairly modern Linux distributions that our contributors use and recent versions of macOS.
On Windows you can run it via WSL2 (Windows Subsystem for Linux 2) or via
Linux Containers.
The work to add Windows support is tracked via `#10388
<https://github.com/apache/airflow/issues/10388>`__
- but it is not a high priority. You should only use Linux-based distros as
"Production" execution environment
- as this is the only environment that is supported. The only distro that is
used in our CI tests and that
+ but it is not a high priority. You should only use Linux-based distributions
as "Production environment"
+ as this is the only environment that is supported. The only distribution
that is used in our CI tests and that
is used in the `Community managed DockerHub image
<https://hub.docker.com/p/apache/airflow>`__ is
``Debian Bookworm``.
diff --git a/airflow-core/docs/installation/supported-versions.rst
b/airflow-core/docs/installation/supported-versions.rst
index 6c3f8952b95..b649fce1b41 100644
--- a/airflow-core/docs/installation/supported-versions.rst
+++ b/airflow-core/docs/installation/supported-versions.rst
@@ -39,28 +39,28 @@ Version Current Patch/Minor State First Release
Limited Support
.. End of auto-generated table
-Limited support versions will be supported with security and critical bug fix
only.
-EOL versions will not get any fixes nor support.
+Limited support versions will be supported with security and critical bug
fixes only.
+EOL versions will not get any fixes or support.
We **highly** recommend installing the latest Airflow release which has richer
features.
Support for Python and Kubernetes versions
``````````````````````````````````````````
-As of Airflow 2.0 we agreed to certain rules we follow for Python and
Kubernetes support.
+For Airflow 2.0+ versions, we agreed to certain rules we follow for Python and
Kubernetes support.
They are based on the official release schedule of Python and Kubernetes,
nicely summarized in the
`Python Developer's Guide
<https://devguide.python.org/#status-of-python-branches>`_ and
`Kubernetes version skew policy
<https://kubernetes.io/docs/setup/release/version-skew-policy>`_.
1. We drop support for Python and Kubernetes versions when they reach EOL. We
drop support for those
- EOL versions in main right after EOL date, and it is effectively removed
when we release the
+ EOL versions in main right after the EOL date, and it is effectively
removed when we release the
first new MINOR (Or MAJOR if there is no new MINOR version) of Airflow
For example for Python 3.6 it means that we drop support in main right
after 23.12.2021, and the first
MAJOR or MINOR version of Airflow released after will not have it.
2. The "oldest" supported version of Python/Kubernetes is the default one.
"Default" is only meaningful
in terms of "smoke tests" in CI PRs which are run using this default
version and default reference
- image available in DockerHub. Currently the ``apache/airflow:latest`` and
``apache/airflow:2.10.2`` images
+ image available in Docker Hub. Currently the ``apache/airflow:latest`` and
``apache/airflow:2.10.2`` images
are Python 3.8 images, however, in the first MINOR/MAJOR release of Airflow
released after 2024-10-14,
they will become Python 3.9 images.
diff --git a/airflow-core/docs/installation/upgrading_to_airflow3.rst
b/airflow-core/docs/installation/upgrading_to_airflow3.rst
index 355ae58a9d3..208f4ba5cd4 100644
--- a/airflow-core/docs/installation/upgrading_to_airflow3.rst
+++ b/airflow-core/docs/installation/upgrading_to_airflow3.rst
@@ -33,7 +33,7 @@ Step 2: Clean and back up your existing Airflow Instance
- It is highly recommended to make a backup of your Airflow instance
specifically including your Airflow metadata DB before starting the migration
process.
- If you do not have a "hot backup" capability for your DB, you should do it
after shutting down your Airflow instances, so that the backup of your database
will be consistent.
-- If you did not make a backup and your migration fails, you might end-up in a
half-migrated state and restoring DB from backup and repeating the migration
+- If you did not make a backup and your migration fails, you might end up in a
half-migrated state and restoring DB from backup and repeating the migration
might be the only easy way out. This can for example be caused by a broken
network connection between your CLI and the database while the migration
happens, so taking a
backup is an important precaution to avoid problems like this.
- A long running Airflow instance can accumulate a certain amount of silt, in
the form of old database entries, which are no longer
@@ -134,7 +134,7 @@ These include:
- **Sequential Executor**: Replaced by LocalExecutor, which can be used with
SQLite for local development use cases.
- **SLAs**: Deprecated and removed; Will be replaced by forthcoming `Deadline
Alerts <https://cwiki.apache.org/confluence/x/tglIEw>`_.
- **Subdir**: Used as an argument on many CLI commands (``--subdir`` or ``-S``
has been superseded by DAG bundles.
-- **Following keys are no longer available in task context. If not replaced,
will cause DAG errors:**:
+- **Following keys are no longer available in task context. If not replaced,
will cause DAG errors**:
- ``tomorrow_ds``
- ``tomorrow_ds_nodash``
diff --git a/airflow-core/docs/templates-ref.rst
b/airflow-core/docs/templates-ref.rst
index ccd2d08c889..fe292501a58 100644
--- a/airflow-core/docs/templates-ref.rst
+++ b/airflow-core/docs/templates-ref.rst
@@ -186,7 +186,7 @@ Variable Description
``macros.random`` The standard lib's :class:`random.random`
=================================
==============================================
-Some airflow specific macros are also defined:
+Some Airflow-specific macros are also defined:
.. automodule:: airflow.macros
:members: