This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-11-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v2-11-test by this push:
     new 242119c0dda Update version to 2.11.2 (#63150)
242119c0dda is described below

commit 242119c0ddab0445dacbd7e7a38d00410e83ce32
Author: Jarek Potiuk <[email protected]>
AuthorDate: Mon Mar 9 10:25:16 2026 +0100

    Update version to 2.11.2 (#63150)
---
 README.md                                          | 14 +++++------
 RELEASE_NOTES.rst                                  | 27 ++++++++++++++++++++++
 airflow/__init__.py                                |  2 +-
 airflow/providers/MANAGING_PROVIDERS_LIFECYCLE.rst |  6 ++---
 airflow/reproducible_build.yaml                    |  4 ++--
 contributing-docs/03_contributors_quick_start.rst  |  2 +-
 contributing-docs/05_pull_requests.rst             |  2 +-
 contributing-docs/testing/docker_compose_tests.rst |  6 ++---
 contributing-docs/testing/k8s_tests.rst            | 14 +++++------
 dev/breeze/doc/04_troubleshooting.rst              |  2 +-
 dev/breeze/src/airflow_breeze/global_constants.py  |  1 +
 .../executors/general.rst                          |  2 +-
 .../modules_management.rst                         | 20 ++++++++--------
 .../installation/supported-versions.rst            |  4 ++--
 docs/docker-stack/README.md                        | 14 +++++------
 docs/docker-stack/build.rst                        |  2 +-
 .../extending/add-airflow-configuration/Dockerfile |  2 +-
 .../extending/add-apt-packages/Dockerfile          |  2 +-
 .../add-build-essential-extend/Dockerfile          |  2 +-
 .../extending/add-providers/Dockerfile             |  2 +-
 .../add-pypi-packages-constraints/Dockerfile       |  2 +-
 .../extending/add-pypi-packages-uv/Dockerfile      |  2 +-
 .../extending/add-pypi-packages/Dockerfile         |  2 +-
 .../extending/add-requirement-packages/Dockerfile  |  2 +-
 .../extending/custom-providers/Dockerfile          |  2 +-
 .../extending/embedding-dags/Dockerfile            |  2 +-
 .../extending/writable-directory/Dockerfile        |  2 +-
 docs/docker-stack/entrypoint.rst                   | 14 +++++------
 docs/docker-stack/index.rst                        |  8 +++----
 generated/PYPI_README.md                           | 10 ++++----
 newsfragments/62533.bugfix.rst                     |  1 -
 newsfragments/62647.bugfix.rst                     |  1 -
 scripts/ci/pre_commit/supported_versions.py        |  4 ++--
 33 files changed, 104 insertions(+), 78 deletions(-)

diff --git a/README.md b/README.md
index fd04baae1f8..48f66f4abc5 100644
--- a/README.md
+++ b/README.md
@@ -98,7 +98,7 @@ Airflow is not a streaming solution, but it is often used to process real-time d
 
 Apache Airflow is tested with:
 
-|            | Main version (dev)                 | Stable version (3.1.7) | Stable version (2.11.1)      |
+|            | Main version (dev)                 | Stable version (3.1.8) | Stable version (2.11.2)      |
 |------------|------------------------------------|------------------------|------------------------------|
 | Python     | 3.10, 3.11, 3.12, 3.13             | 3.10, 3.11, 3.12, 3.13 | 3.10, 3.11, 3.12             |
 | Platform   | AMD64/ARM64                        | AMD64/ARM64            | AMD64/ARM64(\*)              |
@@ -170,15 +170,15 @@ them to the appropriate format and workflow that your tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.11.1' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.11.1/constraints-3.10.txt"
+pip install 'apache-airflow==2.11.2' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.11.2/constraints-3.10.txt"
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.11.1' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.11.1/constraints-3.10.txt"
+pip install 'apache-airflow[postgres,google]==2.11.2' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.11.2/constraints-3.10.txt"
 ```
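
An editorial aside, not part of the commit: the constraints-URL pattern used in the commands above can be sketched in Python. This is purely illustrative; `AIRFLOW_VERSION` is hard-coded here, and the URL layout is taken from the README lines above.

```python
import sys

# Rebuild the constraints URL pattern from the README for the running
# interpreter: constraints-<airflow-version>/constraints-<python-version>.txt
AIRFLOW_VERSION = "2.11.2"
PYTHON_VERSION = f"{sys.version_info.major}.{sys.version_info.minor}"
CONSTRAINTS_URL = (
    "https://raw.githubusercontent.com/apache/airflow/"
    f"constraints-{AIRFLOW_VERSION}/constraints-{PYTHON_VERSION}.txt"
)
print(CONSTRAINTS_URL)
```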
 
 For information on installing provider packages, check
@@ -288,8 +288,8 @@ Apache Airflow version life cycle:
 
 | Version   | Current Patch/Minor   | State               | First Release   | Limited Maintenance   | EOL/Terminated   |
 |-----------|-----------------------|---------------------|-----------------|-----------------------|------------------|
-| 3         | 3.1.7                 | Maintenance         | Apr 22, 2025    | TBD                   | TBD              |
-| 2         | 2.11.1                | Limited maintenance | Dec 17, 2020    | Oct 22, 2025          | Apr 22, 2026     |
+| 3         | 3.1.8                 | Maintenance         | Apr 22, 2025    | TBD                   | TBD              |
+| 2         | 2.11.2                | Limited maintenance | Dec 17, 2020    | Oct 22, 2025          | Apr 22, 2026     |
 | 1.10      | 1.10.15               | EOL                 | Aug 27, 2018    | Dec 17, 2020          | June 17, 2021    |
 | 1.9       | 1.9.0                 | EOL                 | Jan 03, 2018    | Aug 27, 2018          | Aug 27, 2018     |
 | 1.8      | 1.8.2                 | EOL                 | Mar 19, 2017    | Jan 03, 2018          | Jan 03, 2018     |
diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index fb686e99120..4609bc951f0 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -21,6 +21,33 @@
 
 .. towncrier release notes start
 
+Airflow 2.11.2 (2026-03-11)
+---------------------------
+
+Significant Changes
+^^^^^^^^^^^^^^^^^^^
+
+Bug Fixes
+"""""""""
+
+- Fix Task Instances list view rendering raw HTML instead of clickable links for Dag Id, Task Id, and Run Id columns (#62533)
+- In 2.11.1, ``core.use_historical_filename_templates`` was mistakenly read by Airflow instead of ``logging.use_historical_filename_templates``. The ``core`` option is deprecated in Airflow 2.11.2. Both options are removed in Airflow 3, where historical templates are supported and do not cause the low-severity security issue (#62647)
+- Gracefully handle 404 from worker log server for historical retry attempts (#63002)
+- ``task_instance_mutation_hook`` receives a TI with ``run_id`` set (#62999)
+- Fix missing logs in UI for tasks in ``UP_FOR_RETRY`` and ``UP_FOR_RESCHEDULE`` states (#54547) (#62877)
+- Fix 500 error on webserver after upgrading to FAB provider 1.5.4 (#62412)
+- Lazily import fs and package_index hook in providers manager (#52117) (#62356)
+
+Updated dependencies
+""""""""""""""""""""
+
+- Bump the core-ui-package-updates group across 1 directory with 87 updates (#61091)
+- Bump filelock (#62952)
+- Limit Celery provider to not install 3.17.0 as it breaks Airflow 2.11 (#63046)
+- Bump the pip-dependency-updates group across 3 directories with 5 updates (#62808)
+- Upgrade to latest released build dependencies (#62613)
+
+
 Airflow 2.11.1 (2026-02-20)
 ---------------------------
 
diff --git a/airflow/__init__.py b/airflow/__init__.py
index 348c0616af9..a5466c08865 100644
--- a/airflow/__init__.py
+++ b/airflow/__init__.py
@@ -17,7 +17,7 @@
 # under the License.
 from __future__ import annotations
 
-__version__ = "2.11.1"
+__version__ = "2.11.2"
 
 import os
 import sys
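
A hedged sketch (editorial, not Airflow code): version strings such as the ``__version__`` bumped above are safest compared numerically, not lexicographically; `version_tuple` below is a hypothetical helper for illustration.

```python
__version__ = "2.11.2"  # hard-coded copy of the value bumped in this commit

def version_tuple(version: str) -> tuple[int, ...]:
    # "2.11.2" -> (2, 11, 2); string comparison would wrongly put "2.9" > "2.11"
    return tuple(int(part) for part in version.split("."))

print(version_tuple(__version__))  # (2, 11, 2)
```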
diff --git a/airflow/providers/MANAGING_PROVIDERS_LIFECYCLE.rst b/airflow/providers/MANAGING_PROVIDERS_LIFECYCLE.rst
index 855f4224176..ab6d74ff741 100644
--- a/airflow/providers/MANAGING_PROVIDERS_LIFECYCLE.rst
+++ b/airflow/providers/MANAGING_PROVIDERS_LIFECYCLE.rst
@@ -345,7 +345,7 @@ Example failing collection after ``google`` provider has been suspended:
    ImportError while importing test module '/opt/airflow/tests/providers/apache/beam/operators/test_beam.py'.
     Hint: make sure your test modules/packages have valid Python names.
     Traceback:
-    /usr/local/lib/python3.8/importlib/__init__.py:127: in import_module
+    /usr/local/lib/python3.10/importlib/__init__.py:127: in import_module
         return _bootstrap._gcd_import(name[level:], package, level)
     tests/providers/apache/beam/operators/test_beam.py:25: in <module>
         from airflow.providers.apache.beam.operators.beam import (
@@ -373,7 +373,7 @@ The fix is to add this line at the top of the ``tests/providers/apache/beam/oper
       Traceback (most recent call last):
          File "/opt/airflow/scripts/in_container/verify_providers.py", line 266, in import_all_classes
           _module = importlib.import_module(modinfo.name)
-        File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
+        File "/usr/local/lib/python3.10/importlib/__init__.py", line 127, in import_module
           return _bootstrap._gcd_import(name, package, level)
         File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
         File "<frozen importlib._bootstrap>", line 983, in _find_and_load
@@ -381,7 +381,7 @@ The fix is to add this line at the top of the ``tests/providers/apache/beam/oper
         File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
         File "<frozen importlib._bootstrap_external>", line 728, in exec_module
         File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
-        File "/usr/local/lib/python3.8/site-packages/airflow/providers/mysql/transfers/s3_to_mysql.py", line 23, in <module>
+        File "/usr/local/lib/python3.10/site-packages/airflow/providers/mysql/transfers/s3_to_mysql.py", line 23, in <module>
           from airflow.providers.amazon.aws.hooks.s3 import S3Hook
       ModuleNotFoundError: No module named 'airflow.providers.amazon'
 
diff --git a/airflow/reproducible_build.yaml b/airflow/reproducible_build.yaml
index ded4f2c69b6..e28ff747419 100644
--- a/airflow/reproducible_build.yaml
+++ b/airflow/reproducible_build.yaml
@@ -1,2 +1,2 @@
-release-notes-hash: 80ae3bef06b3cffdfdd6a64e6442bc0e
-source-date-epoch: 1771642021
+release-notes-hash: 647fdaa1d742db7f83fc191ca1a82e71
+source-date-epoch: 1773003042
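
Editorial note on the field above: ``source-date-epoch`` pins build timestamps for reproducible builds. A small sketch (not Airflow tooling) renders the new epoch as a UTC datetime as a sanity check; the constant is copied from the line above.

```python
from datetime import datetime, timezone

# SOURCE_DATE_EPOCH from reproducible_build.yaml, interpreted as seconds
# since the Unix epoch, rendered in UTC.
SOURCE_DATE_EPOCH = 1773003042
build_time = datetime.fromtimestamp(SOURCE_DATE_EPOCH, tz=timezone.utc)
print(build_time.isoformat())
```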
diff --git a/contributing-docs/03_contributors_quick_start.rst b/contributing-docs/03_contributors_quick_start.rst
index fe97a4c2ada..154ee68cb51 100644
--- a/contributing-docs/03_contributors_quick_start.rst
+++ b/contributing-docs/03_contributors_quick_start.rst
@@ -324,7 +324,7 @@ Using Breeze
       Use CI image.
 
    Branch name:            main
-   Docker image:           ghcr.io/apache/airflow/main/ci/python3.8:latest
+   Docker image:           ghcr.io/apache/airflow/main/ci/python3.10:latest
    Airflow source version: 2.4.0.dev0
    Python version:         3.8
    Backend:                mysql 5.7
diff --git a/contributing-docs/05_pull_requests.rst b/contributing-docs/05_pull_requests.rst
index 31f5b33a61c..0e4f1354e9e 100644
--- a/contributing-docs/05_pull_requests.rst
+++ b/contributing-docs/05_pull_requests.rst
@@ -92,7 +92,7 @@ these guidelines:
     you can push your code to PR and see results of the tests in the CI.
 
-   You can use any supported python version to run the tests, but the best is to check
-    if it works for the oldest supported version (Python 3.8 currently). In rare cases
+    if it works for the oldest supported version (Python 3.10 currently). In rare cases
     tests might fail with the oldest version when you use features that are available in newer Python
     versions. For that purpose we have ``airflow.compat`` package where we keep back-ported
     useful features from newer versions.
diff --git a/contributing-docs/testing/docker_compose_tests.rst b/contributing-docs/testing/docker_compose_tests.rst
index 8ecdc071ff1..0cb0e9e3f25 100644
--- a/contributing-docs/testing/docker_compose_tests.rst
+++ b/contributing-docs/testing/docker_compose_tests.rst
@@ -65,8 +65,8 @@ to see the output of the test as it happens (it can be also set via
 The test can be also run manually with ``pytest docker_tests/test_docker_compose_quick_start.py``
 command, provided that you have a local airflow venv with ``dev`` extra set and the
 ``DOCKER_IMAGE`` environment variable is set to the image you want to test. The variable defaults
-to ``ghcr.io/apache/airflow/main/prod/python3.8:latest`` which is built by default
-when you run ``breeze prod-image build --python 3.9``. also the switches ``--skip-docker-compose-deletion``
+to ``ghcr.io/apache/airflow/main/prod/python3.10:latest`` which is built by default
+when you run ``breeze prod-image build --python 3.10``. also the switches ``--skip-docker-compose-deletion``
 and ``--wait-for-containers-timeout`` can only be passed via environment variables.
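
The env-var fallback described in the paragraph above can be sketched as follows (editorial illustration; the default tag is copied from the text, and `DOCKER_IMAGE` is the variable the docs name):

```python
import os

# DOCKER_IMAGE overrides the image under test; otherwise the documented
# default prod image tag is used.
DEFAULT_IMAGE = "ghcr.io/apache/airflow/main/prod/python3.10:latest"
image = os.environ.get("DOCKER_IMAGE", DEFAULT_IMAGE)
print(image)
```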
 
 If you want to debug the deployment using ``docker compose`` commands after ``SKIP_DOCKER_COMPOSE_DELETION``
@@ -87,7 +87,7 @@ the prod image build command above.
 
 .. code-block:: bash
 
-    export AIRFLOW_IMAGE_NAME=ghcr.io/apache/airflow/main/prod/python3.8:latest
+    export AIRFLOW_IMAGE_NAME=ghcr.io/apache/airflow/main/prod/python3.10:latest
 
 and follow the instructions in the
 `Running Airflow in Docker <https://airflow.apache.org/docs/apache-airflow/stable/howto/docker-compose/index.html>`_
diff --git a/contributing-docs/testing/k8s_tests.rst b/contributing-docs/testing/k8s_tests.rst
index 79ebba89bdc..3a96138b276 100644
--- a/contributing-docs/testing/k8s_tests.rst
+++ b/contributing-docs/testing/k8s_tests.rst
@@ -373,15 +373,15 @@ Should show the status of current KinD cluster.
 
 .. code-block:: text
 
-    Building the K8S image for Python 3.8 using airflow base image: ghcr.io/apache/airflow/main/prod/python3.8:latest
+    Building the K8S image for Python 3.10 using airflow base image: ghcr.io/apache/airflow/main/prod/python3.10:latest
 
     [+] Building 0.1s (8/8) FINISHED
     => [internal] load build definition from Dockerfile                                    0.0s
     => => transferring dockerfile: 301B                                                    0.0s
     => [internal] load .dockerignore                                                       0.0s
     => => transferring context: 35B                                                        0.0s
-     => [internal] load metadata for ghcr.io/apache/airflow/main/prod/python3.8:latest     0.0s
-     => [1/3] FROM ghcr.io/apache/airflow/main/prod/python3.8:latest                       0.0s
+     => [internal] load metadata for ghcr.io/apache/airflow/main/prod/python3.10:latest    0.0s
+     => [1/3] FROM ghcr.io/apache/airflow/main/prod/python3.10:latest                      0.0s
     => [internal] load build context                                                       0.0s
     => => transferring context: 3.00kB                                                     0.0s
     => CACHED [2/3] COPY airflow/example_dags/ /opt/airflow/dags/                          0.0s
@@ -389,7 +389,7 @@ Should show the status of current KinD cluster.
     => exporting to image                                                                  0.0s
     => => exporting layers                                                                 0.0s
     => => writing image sha256:c0bdd363c549c3b0731b8e8ce34153d081f239ee2b582355b7b3ffd5394c40bb  0.0s
-     => => naming to ghcr.io/apache/airflow/main/prod/python3.8-kubernetes:latest
+     => => naming to ghcr.io/apache/airflow/main/prod/python3.10-kubernetes:latest
 
     NEXT STEP: You might now upload your k8s image by:
 
@@ -409,9 +409,9 @@ Should show the status of current KinD cluster.
    Good version of kubectl installed: 1.25.0 in /Users/jarek/IdeaProjects/airflow/.build/.k8s-env/bin
    Good version of helm installed: 3.9.2 in /Users/jarek/IdeaProjects/airflow/.build/.k8s-env/bin
     Stable repo is already added
-    Uploading Airflow image ghcr.io/apache/airflow/main/prod/python3.8-kubernetes to cluster airflow-python-3.8-v1.24.2
-    Image: "ghcr.io/apache/airflow/main/prod/python3.8-kubernetes" with ID "sha256:fb6195f7c2c2ad97788a563a3fe9420bf3576c85575378d642cd7985aff97412" not yet present on node "airflow-python-3.8-v1.24.2-worker", loading...
-    Image: "ghcr.io/apache/airflow/main/prod/python3.8-kubernetes" with ID "sha256:fb6195f7c2c2ad97788a563a3fe9420bf3576c85575378d642cd7985aff97412" not yet present on node "airflow-python-3.8-v1.24.2-control-plane", loading...
+    Uploading Airflow image ghcr.io/apache/airflow/main/prod/python3.10-kubernetes to cluster airflow-python-3.8-v1.24.2
+    Image: "ghcr.io/apache/airflow/main/prod/python3.10-kubernetes" with ID "sha256:fb6195f7c2c2ad97788a563a3fe9420bf3576c85575378d642cd7985aff97412" not yet present on node "airflow-python-3.8-v1.24.2-worker", loading...
+    Image: "ghcr.io/apache/airflow/main/prod/python3.10-kubernetes" with ID "sha256:fb6195f7c2c2ad97788a563a3fe9420bf3576c85575378d642cd7985aff97412" not yet present on node "airflow-python-3.8-v1.24.2-control-plane", loading...
 
     NEXT STEP: You might now deploy airflow by:
 
diff --git a/dev/breeze/doc/04_troubleshooting.rst b/dev/breeze/doc/04_troubleshooting.rst
index 901638c752b..85e5fa7ce6d 100644
--- a/dev/breeze/doc/04_troubleshooting.rst
+++ b/dev/breeze/doc/04_troubleshooting.rst
@@ -79,7 +79,7 @@ When you see this error:
 
 .. code-block::
 
-    ImportError: cannot import name 'cache' from 'functools' (/Users/jarek/Library/Application Support/hatch/pythons/3.8/python/lib/python3.8/functools.py)
+    ImportError: cannot import name 'cache' from 'functools' (/Users/jarek/Library/Application Support/hatch/pythons/3.8/python/lib/python3.10/functools.py)
 
 or
 
diff --git a/dev/breeze/src/airflow_breeze/global_constants.py b/dev/breeze/src/airflow_breeze/global_constants.py
index aab799864f3..8487e1bc54b 100644
--- a/dev/breeze/src/airflow_breeze/global_constants.py
+++ b/dev/breeze/src/airflow_breeze/global_constants.py
@@ -396,6 +396,7 @@ AIRFLOW_PYTHON_COMPATIBILITY_MATRIX = {
     "2.10.5": ["3.8", "3.9", "3.10", "3.11", "3.12"],
     "2.11.0": ["3.9", "3.10", "3.11", "3.12"],
     "2.11.1": ["3.10", "3.11", "3.12"],
+    "2.11.2": ["3.10", "3.11", "3.12"],
 }
 
 DB_RESET = False
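
An editorial sketch of how a compatibility matrix like the one extended above can be consulted. The dict below is a local copy of a few entries for illustration only, and `supported_pythons` is a hypothetical helper, not Breeze code:

```python
# Local copy of a few entries from AIRFLOW_PYTHON_COMPATIBILITY_MATRIX,
# mapping an Airflow release to the Python versions it supports.
AIRFLOW_PYTHON_COMPATIBILITY_MATRIX = {
    "2.11.0": ["3.9", "3.10", "3.11", "3.12"],
    "2.11.1": ["3.10", "3.11", "3.12"],
    "2.11.2": ["3.10", "3.11", "3.12"],
}

def supported_pythons(airflow_version: str) -> list[str]:
    # Unknown versions return an empty list rather than raising
    return AIRFLOW_PYTHON_COMPATIBILITY_MATRIX.get(airflow_version, [])

print(supported_pythons("2.11.2"))  # ['3.10', '3.11', '3.12']
```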
diff --git a/docs/apache-airflow-providers-amazon/executors/general.rst b/docs/apache-airflow-providers-amazon/executors/general.rst
index 94d0248008a..0d11cbe1cc4 100644
--- a/docs/apache-airflow-providers-amazon/executors/general.rst
+++ b/docs/apache-airflow-providers-amazon/executors/general.rst
@@ -139,7 +139,7 @@ executor.) Apache Airflow images with specific python versions can be
 downloaded from the Dockerhub registry, and filtering tags by the
 `python
 version <https://hub.docker.com/r/apache/airflow/tags?page=1&name=3.8>`__.
-For example, the tag ``latest-python3.8`` specifies that the image will
+For example, the tag ``latest-python3.10`` specifies that the image will
 have python 3.8 installed.
 
 
diff --git a/docs/apache-airflow/administration-and-deployment/modules_management.rst b/docs/apache-airflow/administration-and-deployment/modules_management.rst
index dc6be49b1d4..2cb83312e62 100644
--- a/docs/apache-airflow/administration-and-deployment/modules_management.rst
+++ b/docs/apache-airflow/administration-and-deployment/modules_management.rst
@@ -58,9 +58,9 @@ by running an interactive terminal as in the example below:
     >>> pprint(sys.path)
     ['',
      '/home/arch/.pyenv/versions/3.8.4/lib/python37.zip',
-     '/home/arch/.pyenv/versions/3.8.4/lib/python3.8',
-     '/home/arch/.pyenv/versions/3.8.4/lib/python3.8/lib-dynload',
-     '/home/arch/venvs/airflow/lib/python3.8/site-packages']
+     '/home/arch/.pyenv/versions/3.8.4/lib/python3.10',
+     '/home/arch/.pyenv/versions/3.8.4/lib/python3.10/lib-dynload',
+     '/home/arch/venvs/airflow/lib/python3.10/site-packages']
 
 ``sys.path`` is initialized during program startup. The first precedence is
 given to the current directory, i.e, ``path[0]`` is the directory containing
@@ -237,7 +237,7 @@ specified by this command may be as follows:
 
 .. code-block:: none
 
-    Python PATH: [/home/rootcss/venvs/airflow/bin:/usr/lib/python38.zip:/usr/lib/python3.8:/usr/lib/python3.8/lib-dynload:/home/rootcss/venvs/airflow/lib/python3.8/site-packages:/home/rootcss/airflow/dags:/home/rootcss/airflow/config:/home/rootcss/airflow/plugins]
+    Python PATH: [/home/rootcss/venvs/airflow/bin:/usr/lib/python38.zip:/usr/lib/python3.10:/usr/lib/python3.10/lib-dynload:/home/rootcss/venvs/airflow/lib/python3.10/site-packages:/home/rootcss/airflow/dags:/home/rootcss/airflow/config:/home/rootcss/airflow/plugins]
 
 Below is the sample output of the ``airflow info`` command:
 
@@ -268,8 +268,8 @@ Below is the sample output of the ``airflow info`` command:
     Paths info
     airflow_home    | /root/airflow
     system_path     | /usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
-    python_path     | /usr/local/bin:/opt/airflow:/files/plugins:/usr/local/lib/python38.zip:/usr/local/lib/python3.8:/usr/
-                    | local/lib/python3.8/lib-dynload:/usr/local/lib/python3.8/site-packages:/files/dags:/root/airflow/conf
+    python_path     | /usr/local/bin:/opt/airflow:/files/plugins:/usr/local/lib/python38.zip:/usr/local/lib/python3.10:/usr/
+                    | local/lib/python3.10/lib-dynload:/usr/local/lib/python3.10/site-packages:/files/dags:/root/airflow/conf
                     | ig:/root/airflow/plugins
     airflow_on_path | True
 
@@ -311,9 +311,9 @@ The ``sys.path`` variable will look like below:
     ['',
      '/home/arch/projects/airflow_operators'
      '/home/arch/.pyenv/versions/3.8.4/lib/python37.zip',
-     '/home/arch/.pyenv/versions/3.8.4/lib/python3.8',
-     '/home/arch/.pyenv/versions/3.8.4/lib/python3.8/lib-dynload',
-     '/home/arch/venvs/airflow/lib/python3.8/site-packages']
+     '/home/arch/.pyenv/versions/3.8.4/lib/python3.10',
+     '/home/arch/.pyenv/versions/3.8.4/lib/python3.10/lib-dynload',
+     '/home/arch/venvs/airflow/lib/python3.10/site-packages']
 
 As we can see that our provided directory is now added to the path, let's
 try to import the package now:
@@ -336,7 +336,7 @@ value as shown below:
 
 .. code-block:: none
 
-    Python PATH: [/home/arch/venv/bin:/home/arch/projects/airflow_operators:/usr/lib/python38.zip:/usr/lib/python3.8:/usr/lib/python3.8/lib-dynload:/home/arch/venv/lib/python3.8/site-packages:/home/arch/airflow/dags:/home/arch/airflow/config:/home/arch/airflow/plugins]
+    Python PATH: [/home/arch/venv/bin:/home/arch/projects/airflow_operators:/usr/lib/python38.zip:/usr/lib/python3.10:/usr/lib/python3.10/lib-dynload:/home/arch/venv/lib/python3.10/site-packages:/home/arch/airflow/dags:/home/arch/airflow/config:/home/arch/airflow/plugins]
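
The inspection step the doc hunks above revolve around can be reproduced with a two-liner (editorial aside; entries will of course differ per machine):

```python
import sys
from pprint import pformat

# Dump the interpreter's module search path, as in the doc's examples;
# path[0] is the directory of the running script (or '' in a REPL).
print(pformat(sys.path))
```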
 
 Creating a package in Python
 ----------------------------
diff --git a/docs/apache-airflow/installation/supported-versions.rst b/docs/apache-airflow/installation/supported-versions.rst
index 022b6829760..3a184c66745 100644
--- a/docs/apache-airflow/installation/supported-versions.rst
+++ b/docs/apache-airflow/installation/supported-versions.rst
@@ -29,8 +29,8 @@ Apache Airflow® version life cycle:
 =========  =====================  ===================  ===============  =====================  ================
 Version    Current Patch/Minor    State                First Release    Limited Maintenance    EOL/Terminated
 =========  =====================  ===================  ===============  =====================  ================
-3          3.1.7                  Maintenance          Apr 22, 2025     TBD                    TBD
-2          2.11.1                 Limited maintenance  Dec 17, 2020     Oct 22, 2025           Apr 22, 2026
+3          3.1.8                  Maintenance          Apr 22, 2025     TBD                    TBD
+2          2.11.2                 Limited maintenance  Dec 17, 2020     Oct 22, 2025           Apr 22, 2026
 1.10       1.10.15                EOL                  Aug 27, 2018     Dec 17, 2020           June 17, 2021
 1.9        1.9.0                  EOL                  Jan 03, 2018     Aug 27, 2018           Aug 27, 2018
 1.8        1.8.2                  EOL                  Mar 19, 2017     Jan 03, 2018           Jan 03, 2018
diff --git a/docs/docker-stack/README.md b/docs/docker-stack/README.md
index a4acf0518dc..51ad9cbd412 100644
--- a/docs/docker-stack/README.md
+++ b/docs/docker-stack/README.md
@@ -31,12 +31,12 @@ Every time a new version of Airflow is released, the images are prepared in the
 [apache/airflow DockerHub](https://hub.docker.com/r/apache/airflow)
 for all the supported Python versions.
 
-You can find the following images there (Assuming Airflow version `2.11.1`):
+You can find the following images there (Assuming Airflow version `2.11.2`):
 
-* `apache/airflow:latest` - the latest released Airflow image with default Python version (3.8 currently)
+* `apache/airflow:latest` - the latest released Airflow image with default Python version (3.10 currently)
 * `apache/airflow:latest-pythonX.Y` - the latest released Airflow image with specific Python version
-* `apache/airflow:2.11.1` - the versioned Airflow image with default Python version (3.8 currently)
-* `apache/airflow:2.11.1-pythonX.Y` - the versioned Airflow image with specific Python version
+* `apache/airflow:2.11.2` - the versioned Airflow image with default Python version (3.10 currently)
+* `apache/airflow:2.11.2-pythonX.Y` - the versioned Airflow image with specific Python version
 
 Those are "reference" regular images. They contain the most common set of extras, dependencies and providers that are
 often used by the users and they are good to "try-things-out" when you want to just take Airflow for a spin,
@@ -45,10 +45,10 @@ You can also use "slim" images that contain only core airflow and are about half
 but you need to add all the [Reference for package extras](https://airflow.apache.org/docs/apache-airflow/stable/extra-packages-ref.html)
 and providers that you need separately via [Building the image](https://airflow.apache.org/docs/docker-stack/build.html#build-build-image).
 
-* `apache/airflow:slim-latest`              - the latest released Airflow image with default Python version (3.8 currently)
+* `apache/airflow:slim-latest`              - the latest released Airflow image with default Python version (3.10 currently)
 * `apache/airflow:slim-latest-pythonX.Y`    - the latest released Airflow image with specific Python version
-* `apache/airflow:slim-2.11.1`           - the versioned Airflow image with default Python version (3.8 currently)
-* `apache/airflow:slim-2.11.1-pythonX.Y` - the versioned Airflow image with specific Python version
+* `apache/airflow:slim-2.11.2`           - the versioned Airflow image with default Python version (3.10 currently)
+* `apache/airflow:slim-2.11.2-pythonX.Y` - the versioned Airflow image with specific Python version
 
 The Apache Airflow image provided as convenience package is optimized for size, and
 it provides just a bare minimal set of the extras and dependencies installed and in most cases
diff --git a/docs/docker-stack/build.rst b/docs/docker-stack/build.rst
index 3317957f009..e85a175d482 100644
--- a/docs/docker-stack/build.rst
+++ b/docs/docker-stack/build.rst
@@ -215,7 +215,7 @@ In the simplest case building your image consists of those steps:
 
 1) Create your own ``Dockerfile`` (name it ``Dockerfile``) where you add:
 
-* information what your image should be based on (for example ``FROM: apache/airflow:|airflow-version|-python3.8``
+* information what your image should be based on (for example ``FROM: apache/airflow:|airflow-version|-python3.10``
 
 * additional steps that should be executed in your image (typically in the form of ``RUN <command>``)
 
diff --git a/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile b/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile
index ef189f4692a..6e5d7e58a8c 100644
--- a/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-airflow-configuration/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 ENV AIRFLOW__CORE__LOAD_EXAMPLES=True
 ENV AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=my_conn_string
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
index b5d8c08875d..2fce0154b8d 100644
--- a/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-apt-packages/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile b/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
index d6acfbfa204..fc7ebf09311 100644
--- a/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-build-essential-extend/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile b/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
index 94d84d52cbf..bffaf3294a8 100644
--- a/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-providers/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 USER root
 RUN apt-get update \
   && apt-get install -y --no-install-recommends \
diff --git a/docs/docker-stack/docker-examples/extending/add-pypi-packages-constraints/Dockerfile b/docs/docker-stack/docker-examples/extending/add-pypi-packages-constraints/Dockerfile
index 7934d39925e..68a65ae540d 100644
--- a/docs/docker-stack/docker-examples/extending/add-pypi-packages-constraints/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-pypi-packages-constraints/Dockerfile
@@ -15,6 +15,6 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 RUN pip install --no-cache-dir "apache-airflow==${AIRFLOW_VERSION}" lxml --constraint "${HOME}/constraints.txt"
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/add-pypi-packages-uv/Dockerfile b/docs/docker-stack/docker-examples/extending/add-pypi-packages-uv/Dockerfile
index 081cf2cec16..0c7e69c8521 100644
--- a/docs/docker-stack/docker-examples/extending/add-pypi-packages-uv/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-pypi-packages-uv/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 
 # The `uv` tools is Rust packaging tool that is much faster than `pip` and other installer
 # Support for uv as installation tool is experimental
diff --git a/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
index 5600d8fba88..0656560cc58 100644
--- a/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-pypi-packages/Dockerfile
@@ -15,6 +15,6 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 RUN pip install --no-cache-dir "apache-airflow==${AIRFLOW_VERSION}" lxml
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile b/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile
index aa3a0d20bf2..05d66a6d57d 100644
--- a/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/add-requirement-packages/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 COPY requirements.txt /
 RUN pip install --no-cache-dir "apache-airflow==${AIRFLOW_VERSION}" -r /requirements.txt
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile b/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
index 1bf6d5ddffa..73d7764dc49 100644
--- a/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/custom-providers/Dockerfile
@@ -15,6 +15,6 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 RUN pip install "apache-airflow==${AIRFLOW_VERSION}" --no-cache-dir apache-airflow-providers-docker==2.5.1
 # [END Dockerfile]
diff --git a/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile b/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
index 631e235520d..8c883e4ed99 100644
--- a/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/embedding-dags/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 
 COPY --chown=airflow:root test_dag.py /opt/airflow/dags
 
diff --git a/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile b/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
index d1757131232..3985120ec3a 100644
--- a/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
+++ b/docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile
@@ -15,7 +15,7 @@
 
 # This is an example Dockerfile. It is not intended for PRODUCTION use
 # [START Dockerfile]
-FROM apache/airflow:2.11.1
+FROM apache/airflow:2.11.2
 RUN umask 0002; \
     mkdir -p ~/writeable-directory
 # [END Dockerfile]
diff --git a/docs/docker-stack/entrypoint.rst b/docs/docker-stack/entrypoint.rst
index 4d054fc020c..0444974159f 100644
--- a/docs/docker-stack/entrypoint.rst
+++ b/docs/docker-stack/entrypoint.rst
@@ -132,7 +132,7 @@ if you specify extra arguments. For example:
 
 .. code-block:: bash
 
-  docker run -it apache/airflow:2.11.1-python3.8 bash -c "ls -la"
+  docker run -it apache/airflow:2.11.2-python3.10 bash -c "ls -la"
   total 16
   drwxr-xr-x 4 airflow root 4096 Jun  5 18:12 .
   drwxr-xr-x 1 root    root 4096 Jun  5 18:12 ..
@@ -144,7 +144,7 @@ you pass extra parameters. For example:
 
 .. code-block:: bash
 
-  > docker run -it apache/airflow:2.11.1-python3.8 python -c "print('test')"
+  > docker run -it apache/airflow:2.11.2-python3.10 python -c "print('test')"
   test
 
 If first argument equals to "airflow" - the rest of the arguments is treated as an airflow command
@@ -152,13 +152,13 @@ to execute. Example:
 
 .. code-block:: bash
 
-   docker run -it apache/airflow:2.11.1-python3.8 airflow webserver
+   docker run -it apache/airflow:2.11.2-python3.10 airflow webserver
 
 If there are any other arguments - they are simply passed to the "airflow" command
 
 .. code-block:: bash
 
-  > docker run -it apache/airflow:2.11.1-python3.8 help
+  > docker run -it apache/airflow:2.11.2-python3.10 help
     usage: airflow [-h] GROUP_OR_COMMAND ...
 
     positional arguments:
@@ -363,7 +363,7 @@ database and creating an ``admin/admin`` Admin user with the following command:
     --env "_AIRFLOW_DB_MIGRATE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD=admin" \
-      apache/airflow:2.11.1-python3.8 webserver
+      apache/airflow:2.11.2-python3.10 webserver
 
 
 .. code-block:: bash
@@ -372,7 +372,7 @@ database and creating an ``admin/admin`` Admin user with the following command:
     --env "_AIRFLOW_DB_MIGRATE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \
-      apache/airflow:2.11.1-python3.8 webserver
+      apache/airflow:2.11.2-python3.10 webserver
 
 The commands above perform initialization of the SQLite database, create admin user with admin password
 and Admin role. They also forward local port ``8080`` to the webserver port and finally start the webserver.
@@ -412,6 +412,6 @@ Example:
     --env "_AIRFLOW_DB_MIGRATE=true" \
     --env "_AIRFLOW_WWW_USER_CREATE=true" \
     --env "_AIRFLOW_WWW_USER_PASSWORD_CMD=echo admin" \
-      apache/airflow:2.11.1-python3.8 webserver
+      apache/airflow:2.11.2-python3.10 webserver
 
 This method is only available starting from Docker image of Airflow 2.1.1 and above.
diff --git a/docs/docker-stack/index.rst b/docs/docker-stack/index.rst
index 12ffa94ecae..288a7cf85c3 100644
--- a/docs/docker-stack/index.rst
+++ b/docs/docker-stack/index.rst
@@ -50,9 +50,9 @@ for all the supported Python versions.
 
 You can find the following images there (Assuming Airflow version :subst-code:`|airflow-version|`):
 
-* :subst-code:`apache/airflow:latest`              - the latest released Airflow image with default Python version (3.8 currently)
+* :subst-code:`apache/airflow:latest`              - the latest released Airflow image with default Python version (3.10 currently)
 * :subst-code:`apache/airflow:latest-pythonX.Y`    - the latest released Airflow image with specific Python version
-* :subst-code:`apache/airflow:|airflow-version|`           - the versioned Airflow image with default Python version (3.8 currently)
+* :subst-code:`apache/airflow:|airflow-version|`           - the versioned Airflow image with default Python version (3.10 currently)
 * :subst-code:`apache/airflow:|airflow-version|-pythonX.Y` - the versioned Airflow image with specific Python version
 
 Those are "reference" regular images. They contain the most common set of extras, dependencies and providers that are
@@ -62,9 +62,9 @@ You can also use "slim" images that contain only core airflow and are about half
 but you need to add all the :doc:`apache-airflow:extra-packages-ref` and providers that you need separately
 via :ref:`Building the image <build:build_image>`.
 
-* :subst-code:`apache/airflow:slim-latest`              - the latest released Airflow image with default Python version (3.8 currently)
+* :subst-code:`apache/airflow:slim-latest`              - the latest released Airflow image with default Python version (3.10 currently)
 * :subst-code:`apache/airflow:slim-latest-pythonX.Y`    - the latest released Airflow image with specific Python version
-* :subst-code:`apache/airflow:slim-|airflow-version|`           - the versioned Airflow image with default Python version (3.8 currently)
+* :subst-code:`apache/airflow:slim-|airflow-version|`           - the versioned Airflow image with default Python version (3.10 currently)
 * :subst-code:`apache/airflow:slim-|airflow-version|-pythonX.Y` - the versioned Airflow image with specific Python version
 
 The Apache Airflow image provided as convenience package is optimized for size, and
diff --git a/generated/PYPI_README.md b/generated/PYPI_README.md
index 102f70862a7..33ce4b42491 100644
--- a/generated/PYPI_README.md
+++ b/generated/PYPI_README.md
@@ -55,7 +55,7 @@ Use Airflow to author workflows (Dags) that orchestrate tasks. The Airflow sched
 
 Apache Airflow is tested with:
 
-|            | Main version (dev)                 | Stable version (3.1.7) | Stable version (2.11.1)      |
+|            | Main version (dev)                 | Stable version (3.1.8) | Stable version (2.11.2)      |
 |------------|------------------------------------|------------------------|------------------------------|
 | Python     | 3.10, 3.11, 3.12, 3.13             | 3.10, 3.11, 3.12, 3.13 | 3.10, 3.11, 3.12             |
 | Platform   | AMD64/ARM64                        | AMD64/ARM64            | AMD64/ARM64(\*)              |
@@ -123,15 +123,15 @@ them to the appropriate format and workflow that your tool requires.
 
 
 ```bash
-pip install 'apache-airflow==2.11.1' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.11.1/constraints-3.10.txt"
+pip install 'apache-airflow==2.11.2' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.11.2/constraints-3.10.txt"
 ```
 
 2. Installing with extras (i.e., postgres, google)
 
 ```bash
-pip install 'apache-airflow[postgres,google]==2.11.1' \
- --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.11.1/constraints-3.10.txt"
+pip install 'apache-airflow[postgres,google]==2.11.2' \
+ --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.11.2/constraints-3.10.txt"
 ```
 
 For information on installing provider packages, check
diff --git a/newsfragments/62533.bugfix.rst b/newsfragments/62533.bugfix.rst
deleted file mode 100644
index 5bddc275e32..00000000000
--- a/newsfragments/62533.bugfix.rst
+++ /dev/null
@@ -1 +0,0 @@
-Fix Task Instances list view rendering raw HTML instead of clickable links for Dag Id, Task Id, and Run Id columns.
diff --git a/newsfragments/62647.bugfix.rst b/newsfragments/62647.bugfix.rst
deleted file mode 100644
index 3ff5b13d9c8..00000000000
--- a/newsfragments/62647.bugfix.rst
+++ /dev/null
@@ -1 +0,0 @@
-In 2.11.1 by mistake ``core.use_historical_filename_templates`` was read by Airflow instead of ``logging.use_historical_filename_templates``. The ``core`` option is deprecated in Airflow 2.11.2. Both options are removed in Airflow 3 as historical templates are supported and does not cause low-severity security issue in Airflow 3.
diff --git a/scripts/ci/pre_commit/supported_versions.py b/scripts/ci/pre_commit/supported_versions.py
index 4e12d2b8864..3e088ec0dc5 100755
--- a/scripts/ci/pre_commit/supported_versions.py
+++ b/scripts/ci/pre_commit/supported_versions.py
@@ -34,8 +34,8 @@ HEADERS = (
 )
 
 SUPPORTED_VERSIONS = (
-    ("3", "3.1.7", "Maintenance", "Apr 22, 2025", "TBD", "TBD"),
-    ("2", "2.11.1", "Limited maintenance", "Dec 17, 2020", "Oct 22, 2025", "Apr 22, 2026"),
+    ("3", "3.1.8", "Maintenance", "Apr 22, 2025", "TBD", "TBD"),
+    ("2", "2.11.2", "Limited maintenance", "Dec 17, 2020", "Oct 22, 2025", "Apr 22, 2026"),
     ("1.10", "1.10.15", "EOL", "Aug 27, 2018", "Dec 17, 2020", "June 17, 2021"),
     ("1.9", "1.9.0", "EOL", "Jan 03, 2018", "Aug 27, 2018", "Aug 27, 2018"),
     ("1.8", "1.8.2", "EOL", "Mar 19, 2017", "Jan 03, 2018", "Jan 03, 2018"),

