This is an automated email from the ASF dual-hosted git repository.
uranusjr pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git
The following commit(s) were added to refs/heads/main by this push:
new c5dfb1145c0 Use explicit directives instead of implicit syntax (#50870)
c5dfb1145c0 is described below
commit c5dfb1145c04e88eb6dcdf7e2aae1eef5caafefa
Author: Tzu-ping Chung <[email protected]>
AuthorDate: Wed May 21 18:06:25 2025 +0800
Use explicit directives instead of implicit syntax (#50870)
---
airflow-core/docs/best-practices.rst | 3 +--
airflow-core/docs/howto/docker-compose/index.rst | 12 +++++++-----
airflow-core/docs/howto/dynamic-dag-generation.rst | 3 ++-
airflow-core/docs/installation/upgrading_to_airflow3.rst | 2 +-
airflow-core/docs/public-airflow-interface.rst | 9 +++++----
chart/docs/index.rst | 2 ++
chart/docs/installing-helm-chart-from-sources.rst | 7 +++----
providers/amazon/docs/operators/athena/index.rst | 3 ++-
8 files changed, 23 insertions(+), 18 deletions(-)
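The changes below all follow one pattern: implicit reStructuredText hyperlink references, which Sphinx renders as plain external links (dead when the target is not a real URL), are replaced with explicit ``:ref:`` and ``:doc:`` roles that Sphinx resolves against labels and documents and warns about at build time. A minimal before/after sketch of that pattern, using targets taken from the diff:

```rst
.. Implicit hyperlink syntax: Sphinx treats the angle-bracket part as a
.. literal URI, so a label or document name silently becomes a dead link.
`best practices on Airflow Variables <best_practices/airflow_variables>`_
`Stable REST API <stable-rest-api-ref>`_

.. Explicit roles: Sphinx resolves the target (a label for :ref:, a
.. document path for :doc:) and emits a build warning if it is missing.
:ref:`best practices on Airflow Variables <best_practices/airflow_variables>`
:doc:`Stable REST API <stable-rest-api-ref>`
```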
diff --git a/airflow-core/docs/best-practices.rst b/airflow-core/docs/best-practices.rst
index 268f3e7150f..28c3285339a 100644
--- a/airflow-core/docs/best-practices.rst
+++ b/airflow-core/docs/best-practices.rst
@@ -296,8 +296,6 @@ When you execute that code you will see:
This means that the ``get_array`` is not executed as top-level code, but
``get_task_id`` is.
-.. _best_practices/dynamic_dag_generation:
-
Code Quality and Linting
------------------------
@@ -351,6 +349,7 @@ By integrating ``ruff`` into your development workflow, you can proactively addr
For more information on ``ruff`` and its integration with Airflow, refer to
the `official Airflow documentation
<https://airflow.apache.org/docs/apache-airflow/stable/best-practices.html>`_.
+.. _best_practices/dynamic_dag_generation:
Dynamic DAG Generation
----------------------
diff --git a/airflow-core/docs/howto/docker-compose/index.rst b/airflow-core/docs/howto/docker-compose/index.rst
index df46066a0bc..0d5e2a22bb6 100644
--- a/airflow-core/docs/howto/docker-compose/index.rst
+++ b/airflow-core/docs/howto/docker-compose/index.rst
@@ -307,11 +307,13 @@ Examples of how you can extend the image with custom providers, python packages,
apt packages and more can be found in :doc:`Building the image <docker-stack:build>`.
.. note::
- Creating custom images means that you need to maintain also a level of automation as you need to re-create the images
- when either the packages you want to install or Airflow is upgraded. Please do not forget about keeping these scripts.
- Also keep in mind, that in cases when you run pure Python tasks, you can use the
- `Python Virtualenv functions <_howto/operator:PythonVirtualenvOperator>`_ which will
- dynamically source and install python dependencies during runtime. With Airflow 2.8.0 Virtualenvs can also be cached.
+ Creating custom images means that you need to maintain also a level of
+ automation as you need to re-create the images when either the packages you
+ want to install or Airflow is upgraded. Please do not forget about keeping
+ these scripts. Also keep in mind, that in cases when you run pure Python
+ tasks, you can use :ref:`Python Virtualenv functions <howto/operator:PythonVirtualenvOperator>`,
+ which will dynamically source and install python dependencies during runtime.
+ With Airflow 2.8.0, virtualenvs can also be cached.
Special case - adding dependencies via requirements.txt file
============================================================
diff --git a/airflow-core/docs/howto/dynamic-dag-generation.rst b/airflow-core/docs/howto/dynamic-dag-generation.rst
index 814b620ea71..734e89f5d80 100644
--- a/airflow-core/docs/howto/dynamic-dag-generation.rst
+++ b/airflow-core/docs/howto/dynamic-dag-generation.rst
@@ -40,7 +40,8 @@ If you want to use variables to configure your code, you should always use
`environment variables <https://wiki.archlinux.org/title/environment_variables>`_ in your
top-level code rather than :doc:`Airflow Variables </core-concepts/variables>`. Using Airflow Variables
in top-level code creates a connection to the metadata DB of Airflow to fetch the value, which can slow
-down parsing and place extra load on the DB. See the `best practices on Airflow Variables <best_practice:airflow_variables>`_
+down parsing and place extra load on the DB. See
+:ref:`best practices on Airflow Variables <best_practices/airflow_variables>`
to make the best use of Airflow Variables in your dags using Jinja templates.
For example you could set ``DEPLOYMENT`` variable differently for your production and development
diff --git a/airflow-core/docs/installation/upgrading_to_airflow3.rst b/airflow-core/docs/installation/upgrading_to_airflow3.rst
index f8f932f5476..3f9b3e399d4 100644
--- a/airflow-core/docs/installation/upgrading_to_airflow3.rst
+++ b/airflow-core/docs/installation/upgrading_to_airflow3.rst
@@ -71,7 +71,7 @@ Some changes can be automatically fixed. To do so, run the following command:
ruff check dag/ --select AIR301 --fix --preview
-You can also configure these flags through configuration files. See `Configuring Ruff <Configuring Ruff>`_ for details.
+You can also configure these flags through configuration files. See `Configuring Ruff <https://docs.astral.sh/ruff/configuration/>`_ for details.
Step 4: Install the Standard Providers
--------------------------------------
diff --git a/airflow-core/docs/public-airflow-interface.rst b/airflow-core/docs/public-airflow-interface.rst
index 337413ce802..88eed605eef 100644
--- a/airflow-core/docs/public-airflow-interface.rst
+++ b/airflow-core/docs/public-airflow-interface.rst
@@ -46,9 +46,9 @@ MAJOR version of Airflow. On the other hand, classes and methods starting with `
as protected Python methods) and ``__`` (also known as private Python methods) are not part of the Public
Airflow Interface and might change at any time.
-You can also use Airflow's Public Interface via the `Stable REST API <stable-rest-api-ref>`_ (based on the
+You can also use Airflow's Public Interface via the :doc:`Stable REST API <stable-rest-api-ref>` (based on the
OpenAPI specification). For specific needs you can also use the
-`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref>`_ though its behaviour might change
+:doc:`Airflow Command Line Interface (CLI) <cli-and-env-variables-ref>` though its behaviour might change
in details (such as output format and available flags) so if you want to rely on those in programmatic
way, the Stable REST API is recommended.
@@ -407,11 +407,12 @@ Everything not mentioned in this document should be considered as non-Public Int
Sometimes in other applications those components could be relied on to keep backwards compatibility,
but in Airflow they are not parts of the Public Interface and might change any time:
-* `Database structure <database-erd-ref>`_ is considered to be an internal implementation
+* :doc:`Database structure <database-erd-ref>` is considered to be an internal implementation
detail and you should not assume the structure is going to be maintained in a backwards-compatible way.
-* `Web UI <ui>`_ is continuously evolving and there are no backwards compatibility guarantees on HTML elements.
+* :doc:`Web UI <ui>` is continuously evolving and there are no backwards
+ compatibility guarantees on HTML elements.
* Python classes except those explicitly mentioned in this document, are considered an
internal implementation detail and you should not assume they will be maintained
diff --git a/chart/docs/index.rst b/chart/docs/index.rst
index 3a9d1ad4e46..77ba613e66e 100644
--- a/chart/docs/index.rst
+++ b/chart/docs/index.rst
@@ -81,6 +81,8 @@ Features
* Kerberos secure configuration
* One-command deployment for any type of executor. You don't need to provide other services e.g. Redis/Database to test the Airflow.
+.. _helm_chart_install:
+
Installing the Chart
--------------------
diff --git a/chart/docs/installing-helm-chart-from-sources.rst b/chart/docs/installing-helm-chart-from-sources.rst
index 15af0a4487b..66b9a912a68 100644
--- a/chart/docs/installing-helm-chart-from-sources.rst
+++ b/chart/docs/installing-helm-chart-from-sources.rst
@@ -16,7 +16,7 @@
under the License.
Installing Helm Chart from sources
-----------------------------------
+==================================
Released packages
'''''''''''''''''
@@ -24,9 +24,8 @@ Released packages
.. jinja:: official_download_page
This page describes downloading and verifying ``Apache Airflow Official Helm Chart`` version
- ``{{ package_version}}`` using officially released source packages. You can also install the chart
- directly from the ``airflow.apache.org`` repo as described in
- `Installing the chart <index#installing-the-chart>`_.
+ ``{{ package_version }}`` using officially released source packages. You can also install the chart
+ directly from the ``airflow.apache.org`` repo as described in :ref:`helm_chart_install`.
You can choose different version of the chart by selecting different version from the drop-down at
the top-left of the page.
diff --git a/providers/amazon/docs/operators/athena/index.rst b/providers/amazon/docs/operators/athena/index.rst
index 85130aba62e..8641cd715e0 100644
--- a/providers/amazon/docs/operators/athena/index.rst
+++ b/providers/amazon/docs/operators/athena/index.rst
@@ -38,7 +38,8 @@ Airflow offers two ways to query data using Amazon Athena.
**Amazon Athena SQL (DB API Connection):** Opt for this if you need to execute multiple queries in the same operator and it's essential to retrieve and process query results directly in Airflow, such as for sensing values or further data manipulation.
.. note::
- Both connection methods uses `Amazon Web Services Connection <../../connections/aws>`_ under the hood for authentication.
+ Both connection methods uses :doc:`Amazon Web Services Connection <../../connections/aws>`
+ under the hood for authentication.
You should decide which connection method to use based on your use case.
.. toctree::