This is an automated email from the ASF dual-hosted git repository.
potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git
The following commit(s) were added to refs/heads/main by this push:
new ce9364e031d Manages cross-distribution dependencies better (#58430)
ce9364e031d is described below
commit ce9364e031db8eeb0d72cdbb3455e66396fa8cbb
Author: Jarek Potiuk <[email protected]>
AuthorDate: Wed Nov 19 02:05:03 2025 +0100
Manages cross-distribution dependencies better (#58430)
This PR addresses the case where we have to manage dependencies
between Airflow distributions and pick the right lower-bound
versions for them.
Since we lacked comprehensive documentation on how our dependencies
are managed in general, and we had no good place to describe it, this
PR not only adds code to manage it, but also extends the
"13_airflow_dependencies_and_extras.rst" chapter of the
contributing documentation with comprehensive (and hopefully
helpful) documentation explaining to contributors how our dependencies
are managed.
The changes in this PR:
* add missing documentation explaining dependency management in Airflow
distributions in the monorepo
* add a selective check that verifies whether cross-airflow-distribution
dependencies have been modified in the PR and fails if they were,
unless an appropriate label is set on the PR
* add breeze tests covering the test cases of the selective check
---
.../docs/installation/installing-from-pypi.rst | 5 +-
.../docs/installation/installing-from-pypi.rst | 5 +-
.../13_airflow_dependencies_and_extras.rst | 275 ++++++++++++++++++++-
.../src/airflow_breeze/utils/selective_checks.py | 159 ++++++++++++
dev/breeze/tests/test_selective_checks.py | 166 +++++++++++++
providers-summary-docs/installing-from-pypi.rst | 2 +-
6 files changed, 597 insertions(+), 15 deletions(-)
diff --git a/airflow-core/docs/installation/installing-from-pypi.rst
b/airflow-core/docs/installation/installing-from-pypi.rst
index d2d9d6507d0..1815cd246a3 100644
--- a/airflow-core/docs/installation/installing-from-pypi.rst
+++ b/airflow-core/docs/installation/installing-from-pypi.rst
@@ -24,7 +24,7 @@ PyPI <https://pypi.org/project/apache-airflow/>`__.
Installation tools
''''''''''''''''''
-Only ``pip`` installation is currently officially supported.
+Only ``pip`` and ``uv`` installations are currently officially supported.
.. note::
@@ -33,7 +33,8 @@ Only ``pip`` installation is currently officially supported.
``pip`` - especially when it comes to constraint vs. requirements management.
Installing via ``Poetry`` or ``pip-tools`` is not currently supported. If
you wish to install Airflow
using those tools you should use the constraints and convert them to
appropriate
- format and workflow that your tool requires.
+ format and workflow that your tool requires. ``uv`` follows the ``pip`` approach
+ with ``uv pip``, so it should work similarly.
Typical command to install Airflow from scratch in a reproducible way from
PyPI looks like below:
diff --git a/airflow-ctl/docs/installation/installing-from-pypi.rst
b/airflow-ctl/docs/installation/installing-from-pypi.rst
index cecd66c354e..9ec3a5e73ab 100644
--- a/airflow-ctl/docs/installation/installing-from-pypi.rst
+++ b/airflow-ctl/docs/installation/installing-from-pypi.rst
@@ -24,7 +24,7 @@ PyPI <https://pypi.org/project/apache-airflow-ctl/>`__.
Installation tools
''''''''''''''''''
-Only ``pip`` installation is currently officially supported.
+Only ``pip`` and ``uv`` installations are currently officially supported.
.. note::
@@ -33,7 +33,8 @@ Only ``pip`` installation is currently officially supported.
``pip`` - especially when it comes to constraint vs. requirements management.
Installing via ``Poetry`` or ``pip-tools`` is not currently supported. If
you wish to install airflow
using those tools you should use the constraints and convert them to
appropriate
- format and workflow that your tool requires.
+ format and workflow that your tool requires. ``uv`` follows the ``pip`` approach
+ with ``uv pip``, so it should work similarly.
There are known issues with ``bazel`` that might lead to circular
dependencies when using it to install
Airflow. Please switch to ``pip`` if you encounter such problems. ``Bazel``
community works on fixing
diff --git a/contributing-docs/13_airflow_dependencies_and_extras.rst
b/contributing-docs/13_airflow_dependencies_and_extras.rst
index 0b47ef675c4..28ebdb67ba5 100644
--- a/contributing-docs/13_airflow_dependencies_and_extras.rst
+++ b/contributing-docs/13_airflow_dependencies_and_extras.rst
@@ -18,6 +18,231 @@
Airflow dependencies
====================
+This document describes how we manage Apache Airflow dependencies, as well as
how we make sure
+that users can use Airflow both as an application and as a library when they
are deploying
+their own Airflow - using our constraints mechanism.
+
+Airflow ``pyproject.toml`` files and ``uv`` workspace
+.....................................................
+
+Managing dependencies is an important part of developing Apache Airflow. We have more than 700
+dependencies, and when you add a new feature that requires a new dependency, you also need to
+update the dependency list. The same applies when you want to add a new tool that is used
+for development, testing, or building the documentation - and this document describes how to do it.
+
+When it comes to defining dependencies for any of the Apache Airflow distributions, we follow the
+standard ``pyproject.toml`` format defined in `PEP 518 <https://peps.python.org/pep-0518/>`_
+and `PEP 621 <https://peps.python.org/pep-0621/>`_. We also use dependency groups defined in
+`PEP 735 <https://peps.python.org/pep-0735/>`_ - particularly the ``dev`` dependency group, which we use in
+all our ``pyproject.toml`` files to define development dependencies.
+
+We have a large number (currently more than 100) of Python distributions in the Apache Airflow
+repository - including the main ``apache-airflow`` package, the ``apache-airflow-providers-*`` packages,
+the ``apache-airflow-core`` package, the ``apache-airflow-task-sdk`` package, the ``apache-airflow-ctl``
+package, and several other packages.
+
+They are all connected together via the ``uv`` workspace feature (the workspace is defined in the root
+``pyproject.toml`` file of the repository, in the ``apache-airflow`` distribution definition). The
+workspace feature allows us to run ``uv sync`` at the top of the repository to install all packages from
+all the distributions in editable mode in the development environment and resolve the dependencies
+together, so that we know that the dependencies have no conflicting requirements. The distributions also
+refer to each other by name - which means that when you run ``uv sync`` locally, the local versions of
+the packages are used, not the ones released on PyPI, so you can develop and test changes that span
+multiple packages at the same time. This is a very powerful feature that allows us to maintain the
+complex ecosystem of Apache Airflow distributions in a single monorepo - for example, we can add a new
+feature to common distributions used by multiple providers and test everything together before releasing
+new versions of any of those packages.
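+
+A minimal illustrative sketch of how a ``uv`` workspace ties distributions together - the member paths
+and package names below are simplified placeholders, not the actual Airflow configuration:
+
+.. code-block:: toml
+
+    # Root pyproject.toml - illustrative sketch only, not copied from the Airflow repository
+    [tool.uv.workspace]
+    members = ["airflow-core", "task-sdk", "providers/*"]
+
+    [tool.uv.sources]
+    # resolve this dependency from the local workspace instead of PyPI
+    apache-airflow-core = { workspace = true }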
+
+Managing dependencies in ``pyproject.toml`` files
+.................................................
+
+Each of the ``pyproject.toml`` files in the Apache Airflow repository defines dependencies in one of the
+following sections:
+
+* ``[project.dependencies]`` - this section defines the required dependencies of the package. These
+  dependencies are installed when you install the package without any extras.
+* ``[project.optional-dependencies]`` - this section defines optional dependencies (extras) of the package.
+  These dependencies are installed when you install the package with extras - for example
+  ``pip install apache-airflow[ssh]`` will install the ``ssh`` extra dependencies defined in this section.
+* the ``dev`` group in ``[dependency-groups]`` - this section defines development dependencies of the
+  package. These dependencies are installed by default when you run ``uv sync``. When ``uv`` syncs sources
+  with the local pyproject.toml, it adds the ``dev`` dependency group and the package is installed in
+  editable mode with development dependencies. A minimal sketch of these sections is shown below.
+
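+The following sketch is purely illustrative - the package names and versions are made up and are not
+taken from any real Airflow distribution:
+
+.. code-block:: toml
+
+    [project]
+    name = "apache-airflow-providers-example"  # hypothetical distribution name
+    # required dependencies, installed without any extras
+    dependencies = [
+        "requests>=2.25.1",
+    ]
+
+    [project.optional-dependencies]
+    # extra dependencies, installed e.g. with ``pip install apache-airflow-providers-example[ssh]``
+    ssh = [
+        "paramiko>=2.9.0",
+    ]
+
+    [dependency-groups]
+    # development dependencies, installed by default with ``uv sync``
+    dev = [
+        "pytest>=8.0",
+    ]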
+
+Adding and modifying dependencies
+.................................
+
+Adding and modifying dependencies in Apache Airflow is done by editing the ``pyproject.toml`` file
+of the relevant distribution.
+
+When you add a new dependency, you should make sure that:
+
+* The dependency is added to the right section (main dependencies, optional
dependencies, or
+ development dependencies)
+
+* Some parts of those dependencies might be automatically generated (and overwritten) by our ``prek``
+  hooks. Those are the necessary dependencies that ``prek`` hooks can figure out automatically by
+  analyzing the imports in the sources and the structure of the project. We also have a special case of
+  shared dependencies (described in the `shared dependencies document <../shared/README.md>`__) where we
+  effectively "static-link" some libraries into multiple distributions, to avoid unnecessary coupling and
+  circular dependencies, and those shared libraries contribute some dependencies automatically as well.
+  Pay attention to comments such as the start and end markers of a generated block, shown in the example
+  below. All the dependencies between those comments are automatically generated and you should not
+  modify them manually. Instead, modify the root source of those dependencies - depending on the
+  automation, you can usually also add extra dependencies manually outside of those comment blocks.
+
+ .. code-block:: python
+
+ # Automatically generated airflow optional dependencies
(update_airflow_pyproject_toml.py)
+ # ....
+ # LOTS OF GENERATED DEPENDENCIES HERE
+ # ....
+ # End of automatically generated airflow optional dependencies
+
+* The version specifier is as open as possible on the upper side while still allowing the package to
+  install and pass all tests. We very rarely upper-bound dependencies - only when there is a known
+  conflict with a new or upcoming version of a dependency that breaks the installation or tests
+  (and we always add a comment explaining why we are upper-bounding a dependency).
+
+* Make sure to lower-bound any dependency you add. Usually we lower-bound dependencies to the
+  minimum version that is required for the package to work, but in order to simplify the work of
+  resolvers such as ``pip``, we often lower-bound to a higher (and newer) version than the absolute
+  minimum, especially when the minimum version is very old. This is, for example, good practice for
+  ``boto`` and related packages, where new versions are released frequently (almost daily) and there
+  are many versions that the resolver needs to consider if the lower bound is not new enough.
+
+* Make sure to run ``uv sync`` after modifying dependencies to make sure that there are no
+  conflicts between dependencies of different packages in the workspace. You can run it in multiple
+  ways - either from the root of the repository (which will sync all packages) or from the package
+  you modified (which will sync only that package and its dependencies). It is also a good idea to
+  run ``uv sync --all-packages --all-extras`` at the root of the repository to make sure that
+  all packages with all extras can be installed together without conflicts, but this might sometimes be
+  difficult and slow, as some of the extras require additional system-level dependencies to be installed
+  (for example the ``mysql`` or ``postgres`` extras require client libraries to be installed on the system).
+
+* Make sure to run all tests after modifying dependencies to verify that nothing is broken.
+
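+To illustrate the guidance above on lower and upper bounds, here is a purely hypothetical dependency
+list - the package names, versions and the reason in the comment are made up for illustration:
+
+.. code-block:: toml
+
+    [project]
+    dependencies = [
+        # lower-bound to a reasonably recent version to keep the resolver's search space small
+        "boto3>=1.34.0",
+        # upper bound only because of a known breakage - always explain why in a comment
+        "some-library>=1.2.0,<3.0",  # hypothetical: 3.x is assumed to break tests
+    ]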
+
+Referring to other Apache Airflow distributions in dependencies
+...............................................................
+
+With more than 100 distributions in the repository, it is often necessary to refer to
+other distributions in the dependencies in order to use some common features or simply to use the
+features that the other distribution provides. There are two ways of doing it:
+
+* Regular package linking with ``apache-airflow-*`` dependency
+* Airflow "shared dependencies" mechanism - which is a bit of a custom hack
for Airflow monorepo
+ that allows us to "static-link" some common dependencies into multiple
distributions without
+ creating circular dependencies.
+
+We are not going to describe the shared dependencies mechanism here - please refer to the
+`shared dependencies document <../shared/README.md>`__ for details. There are, however, certain rules
+when it comes to referring to other Airflow distributions in dependencies - here are the important
+ones to remember:
+
+* You can refer to other distributions in your dependencies - as usual, using the distribution name.
+  For example, if you are adding a dependency on the ``apache-airflow-providers-common-compat`` package
+  from ``apache-airflow-providers-google``, you can just add
+  ``apache-airflow-providers-common-compat>=x.y.z`` to the dependencies, and when you run ``uv sync``,
+  the local version of the package will be used automatically (this is thanks to the workspace feature
+  of ``uv``, which does a great job of binding our monorepo together). Some of those are added
+  automatically by prek hooks - when they can detect such dependencies by analyzing imports in the
+  sources - then they are added automatically between the special comments mentioned above, but
+  sometimes (especially when such dependencies are not top-level imports) you might need to add
+  them manually.
+
+* In case you make a feature change in a distribution and would like to update its version, you should
+  never update the distribution version on your own. It is **entirely** up to the Release Manager
+  to bump the version of distributions that are defined as ``project.version``. This goes almost without
+  exception, and any deviations from this rule should be discussed in the ``#release-management`` channel
+  in the Airflow Slack beforehand. The only exception to this rule is when you are adding a new
+  distribution to the repository - in that case you should set the initial version of the distribution -
+  usually ``0.0.1``, ``0.1.0`` or ``1.0.0`` depending on the maturity of the package. But you should still
+  discuss it in the channel.
+
+* Sometimes, when you work on a common distribution, you might add a feature to it or change its API
+  in a way that other packages can use. This is especially true for common packages such as
+  ``apache-airflow-providers-common-compat``, but it can happen for other packages as well (for example
+  ``apache-airflow-providers-apache-beam`` is used by ``apache-airflow-providers-google`` to use
+  ``Apache Beam`` hooks to communicate with Google Dataflow). In such a case, when you are adding a
+  feature to a common package, remember that the feature you just added will only be released in a
+  **FUTURE** release of that common package, and you cannot add a ``>=x.y.z`` dependency on it where
+  ``x.y.z`` is the version you are going to release in the future. Ultimately, this should happen (and
+  happens) when the Release Manager prepares both packages together. Let us repeat - such changes in
+  versions between different Airflow packages should **NOT** be added to the dependencies manually by
+  the contributor. They should **exclusively** be added by the Release Manager when preparing the
+  release of **both** packages together.
+  We have a custom mechanism to support such additions, where it is the contributor's responsibility
+  to mark the dependency with a special comment - simply communicating to the Release Manager that such
+  a dependency should be updated to the next version when the release is prepared. If you need to use a
+  newly added feature in another distribution at the same time as you add it - make sure to add this
+  comment on the line where the dependency you want to use the new feature from is defined:
+
+ .. warning::
+ You must use the exact comment ``# use next version`` - otherwise the
automation will not pick it up.
+
+ .. code-block:: python
+
+ # ...
+ "apache-airflow-SOMETHING>1.0.0", # use next version
+ # ...
+
+ We have tooling in place to:
+
+ - check that no regular PRs modify such cross-dependency versions
+ - make sure that such dependencies marked with the comment are updated
automatically to the next version,
+ when preparing the release
+
+  Example of such a dependency (for example placed in ``apache-airflow-providers-google/pyproject.toml``)
+  when you want to use a new feature that you are adding, in the same PR, to the
+  ``apache-airflow-providers-common-compat`` package:
+
+ .. code-block:: python
+
+ "apache-airflow-providers-google>=1.2.0",
+ "apache-airflow-providers-common-compat>=5.5.0", # use next version
+ "requests>=2.25.1",
+
+  Note that you **SHOULD NOT** change the version of the common-compat package in this PR. When the
+  Release Manager prepares the release of both packages, the version will be updated automatically to
+  the next version that is being released (for example ``5.6.0``) and the Release Manager will make sure
+  that both packages are released together.
+
+* Some cross-dependencies like that are added automatically by prek hooks as well - when they can
+  detect such dependencies by analyzing imports in the sources - then they are added automatically
+  between the special comments mentioned above. In such a case the dependency line will be added in the
+  section that is surrounded by ``# Start automated ...``, ``# End automated ...`` comments as described
+  above. In this case, just copy the dependency line outside of those comments and add the
+  ``# use next version`` comment to it; the next time the ``prek`` hook is run, it will remove the
+  automatically added line and keep only the manually added line with the comment.
+
+* Some of our dependencies have a forced minimum version - mostly because of the Airflow 3 minimum
+  version compatibility. In case other distributions refer to them in the future, we force a minimum
+  version for those distributions via a ``prek`` hook. This produces entries like this:
+
+ .. code-block:: python
+
+ "apache-airflow-providers-google-something>=1.2.0",
+ "apache-airflow-providers-fab>=2.2.0", # Set from MIN_VERSION_OVERRIDE
in update_airflow_pyproject_toml.py
+ "requests>=2.25.1",
+
+  This will not happen when the distribution already depends on an even **newer** version of the
+  dependency; it is mainly a precaution in cases where we **know** we need a minimum version for
+  Airflow 3 compatibility (for example for ``git``, ``common.messaging`` and a few other providers) or
+  where we know that we should not use older versions of providers in the future because some
+  functionality in them stopped working (as in the case of ``amazon`` and ``fab``). You are free to
+  raise those versions if you need to, and ``prek`` will remove those comments automatically.
+
+Our CI system will do all the tests for you anyway - including running some lower-bound checks on
+dependencies. For example, it will take each provider in turn, try to resolve the lowest-possible
+dependencies defined for that provider and check whether the tests still pass, so we should be
+relatively protected against setting lower bounds that are too low, but running ``uv sync`` locally is
+still a good idea to find such issues before they hit the CI system.
+
+
+Airflow as both library and application - constraint files
+----------------------------------------------------------
+
+Why constraints?
+................
+
Airflow is not a standard python project. Most of the python projects fall
into one of two types -
application or library. As described in
`this StackOverflow question
<https://stackoverflow.com/questions/28509481/should-i-pin-my-python-dependencies-versions>`_,
@@ -33,19 +258,39 @@ you are developing your own operators and Dags.
This - seemingly unsolvable - puzzle is solved by having pinned constraints
files.
-**The outline for this document in GitHub is available at top-right corner
button (with 3-dots and 3 lines).**
+Why not use a standard approach?
+..................................
+
+Because the standards defined in the Python ecosystem have not yet caught up with the needs of complex
+projects that are both libraries and applications.
+
+What we do is more of a hack to overcome the limitations of existing tools when it comes to
+reproducible installations of such projects.
+
+Discussion about standards has advanced over the last few years with `PEP 751
+<https://peps.python.org/pep-0751/>`_, which introduces "a file format to record Python dependencies
+for installation reproducibility", but support for this format (``pylock.toml``) is not yet widespread.
+As of November 2025 it is experimental in ``pip``, and ``uv`` supports it as an export format from its
+own lock file (which is focused on development environments). The ability to use the format for
+reproducible installations as intended by ``PEP 751`` is not yet supported, and ``PEP 751`` itself is
+not complete enough to allow that.
+
+The discussion on further advancing this mechanism with a follow-up PEP is ongoing, to support the kind
+of reproducible installation process that Airflow introduced years ago with the constraints hack - see
+the `Pre-PEP discussion
+<https://discuss.python.org/t/pre-pep-add-ability-to-install-a-package-with-reproducible-dependencies/99497/14>`_
+about doing something like what we do.
-Pinned constraint files
------------------------
+Pinned constraints files
+........................
.. note::
- Only ``pip`` installation is officially supported.
+ Only ``pip`` and ``uv`` installations are officially supported.
While it is possible to install Airflow with tools like `poetry
<https://python-poetry.org/>`_ or
`pip-tools <https://pypi.org/project/pip-tools/>`_, they do not share the
same workflow as
``pip`` - especially when it comes to constraint vs. requirements
management.
- Installing via ``Poetry`` or ``pip-tools`` is not currently supported.
+ Installing via ``Poetry`` or ``pip-tools`` is not currently supported. ``uv`` follows the ``pip``
+ approach with ``uv pip``, so it should work similarly.
There are known issues with ``bazel`` that might lead to circular
dependencies when using it to install
Airflow. Please switch to ``pip`` if you encounter such problems. The
``Bazel`` community added support
@@ -128,12 +373,13 @@ if the tests are successful.
.. note::
- Only ``pip`` installation is currently officially supported.
+ Only ``pip`` and ``uv`` installations are currently officially supported.
While there are some successes with using other tools like `poetry
<https://python-poetry.org/>`_ or
`pip-tools <https://pypi.org/project/pip-tools/>`_, they do not share the
same workflow as
``pip`` - especially when it comes to constraint vs. requirements
management.
- Installing via ``Poetry`` or ``pip-tools`` is not currently supported.
+ Installing via ``Poetry`` or ``pip-tools`` is not currently supported. ``uv`` follows the ``pip``
+ approach with ``uv pip``, so it should work similarly.
There are known issues with ``bazel`` that might lead to circular
dependencies when using it to install
Airflow. Please switch to ``pip`` if you encounter such problems. ``Bazel``
community works on fixing
@@ -143,9 +389,8 @@ if the tests are successful.
If you wish to install Airflow using these tools you should use the
constraint files and convert
them to appropriate format and workflow that your tool requires.
-
-Optional dependencies (extras)
-------------------------------
+Optional dependencies (extras) of ``apache-airflow`` package
+............................................................
There are a number of extras that can be specified when installing Airflow.
Those
extras can be specified after the usual pip install - for example ``pip
install -e.[ssh]`` for editable
@@ -154,10 +399,20 @@ airflow as a user, but in ``editable`` mode you can also
install ``devel`` extra
you want to run Airflow locally for testing and ``doc`` extras that install
tools needed to build
the documentation.
+Note that some of those extras are only added in the meta-distribution ``apache-airflow``; they are not
+defined in ``airflow-core`` or any other package. The dependencies on providers and the copying of such
+extras in the ``apache-airflow`` package are done automatically by our ``prek`` hooks, similarly to how
+extras for providers are created from the ``apache-airflow-providers-*`` packages in the workspace.
+
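+An illustrative sketch of what such a generated provider extra in the ``apache-airflow``
+meta-distribution might look like - the exact content is produced by the ``prek`` hooks, so treat this
+only as an assumed example, not a copy of the real file:
+
+.. code-block:: toml
+
+    # apache-airflow pyproject.toml - illustrative fragment only
+    [project.optional-dependencies]
+    ssh = [
+        "apache-airflow-providers-ssh",
+    ]
+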
+There are also some manually defined extras for optional features that are
often used with Airflow.
You can read more about those extras in the
`extras reference
<https://airflow.apache.org/docs/apache-airflow/stable/extra-packages-ref.html>`_.
+
+**The outline for this document in GitHub is available via the top-right corner button (with 3 dots and 3 lines).**
+
+
-----
You can now check how to update Airflow's `metadata database
<14_metadata_database_updates.rst>`__ if you need
diff --git a/dev/breeze/src/airflow_breeze/utils/selective_checks.py
b/dev/breeze/src/airflow_breeze/utils/selective_checks.py
index 3cebaee4f2f..a70e4dab1ef 100644
--- a/dev/breeze/src/airflow_breeze/utils/selective_checks.py
+++ b/dev/breeze/src/airflow_breeze/utils/selective_checks.py
@@ -87,6 +87,7 @@ NON_COMMITTER_BUILD_LABEL = "non committer build"
UPGRADE_TO_NEWER_DEPENDENCIES_LABEL = "upgrade to newer dependencies"
USE_PUBLIC_RUNNERS_LABEL = "use public runners"
ALLOW_TRANSACTION_CHANGE_LABEL = "allow translation change"
+ALLOW_PROVIDER_DEPENDENCY_BUMP_LABEL = "allow provider dependency bump"
ALL_CI_SELECTIVE_TEST_TYPES = "API Always CLI Core Other Serialization"
ALL_PROVIDERS_SELECTIVE_TEST_TYPES = (
@@ -1527,3 +1528,161 @@ class SelectiveChecks:
)
sys.exit(1)
return _translation_changed
+
+ @cached_property
+ def provider_dependency_bump(self) -> bool:
+ """Check for apache-airflow-providers dependency bumps in
pyproject.toml files."""
+ pyproject_files = self._matching_files(
+ FileGroupForCi.ALL_PYPROJECT_TOML_FILES,
+ CI_FILE_GROUP_MATCHES,
+ )
+ if not pyproject_files or not self._commit_ref:
+ return False
+
+ try:
+ import tomllib
+ except ImportError:
+ import tomli as tomllib
+
+ violations = []
+ for pyproject_file in pyproject_files:
+ # Get the new version of the file
+ new_result = run_command(
+ ["git", "show", f"{self._commit_ref}:{pyproject_file}"],
+ capture_output=True,
+ text=True,
+ cwd=AIRFLOW_ROOT_PATH,
+ check=False,
+ )
+ if new_result.returncode != 0:
+ continue
+
+ # Get the old version of the file
+ old_result = run_command(
+ ["git", "show", f"{self._commit_ref}^:{pyproject_file}"],
+ capture_output=True,
+ text=True,
+ cwd=AIRFLOW_ROOT_PATH,
+ check=False,
+ )
+ if old_result.returncode != 0:
+ continue
+
+ try:
+ new_toml = tomllib.loads(new_result.stdout)
+ old_toml = tomllib.loads(old_result.stdout)
+ except Exception:
+ continue
+
+ # Check dependencies and optional-dependencies sections
+ for section in ["dependencies", "optional-dependencies"]:
+ if section not in new_toml.get("project", {}):
+ continue
+
+ new_deps = new_toml["project"][section]
+ old_deps = old_toml.get("project", {}).get(section, {})
+
+ if isinstance(new_deps, dict):
+ # Handle optional-dependencies which is a dict
+ for group_name, deps_list in new_deps.items():
+ old_deps_list = old_deps.get(group_name, []) if
isinstance(old_deps, dict) else []
+ violations.extend(
+ SelectiveChecks._check_provider_deps_in_list(
+ deps_list, old_deps_list, pyproject_file,
f"{section}.{group_name}"
+ )
+ )
+ elif isinstance(new_deps, list):
+ # Handle dependencies which is a list
+ old_deps_list = old_deps if isinstance(old_deps, list)
else []
+ violations.extend(
+ SelectiveChecks._check_provider_deps_in_list(
+ new_deps, old_deps_list, pyproject_file, section
+ )
+ )
+
+ if violations:
+ if ALLOW_PROVIDER_DEPENDENCY_BUMP_LABEL in self._pr_labels:
+ get_console().print(
+ "[warning]The 'allow provider dependency bump' label is
set. "
+ "Bypassing provider dependency check."
+ )
+ return True
+
+ get_console().print(
+ "[error]Provider dependency version bumps detected that should
only be "
+ "performed by Release Managers![/]"
+ )
+ get_console().print()
+ for violation in violations:
+ get_console().print(f"[error] - {violation}[/]")
+ get_console().print()
+ get_console().print(
+ "[warning]Only Release Managers should change >= conditions
for apache-airflow-providers "
+ "dependencies.[/]\n\nIf you want to refer to a future version
of the dependency, please add a "
+ "comment [info]'# use next version'[/info] in the line of the
dependency instead.\n"
+ )
+ get_console().print()
+ get_console().print(
+ f"[warning]If this change is intentional and approved, please
set the label on the PR:[/]\n\n"
+ f"'[info]{ALLOW_PROVIDER_DEPENDENCY_BUMP_LABEL}[/]\n"
+ )
+ get_console().print()
+ get_console().print(
+ "See
https://github.com/apache/airflow/blob/main/contributing-docs/"
+ "13_airflow_dependencies_and_extras.rst for more comprehensive
documentation "
+ "about airflow dependency management."
+ )
+ get_console().print()
+ sys.exit(1)
+ return False
+
+ @staticmethod
+ def _check_provider_deps_in_list(
+ new_deps: list, old_deps: list, file_path: str, section: str
+ ) -> list[str]:
+ """Check a list of dependencies for apache-airflow-providers version
changes."""
+ violations = []
+
+ # Parse dependencies into a dict for easier comparison
+ def parse_dep(dep_str: str) -> tuple[str, str | None]:
+ """Parse a dependency string and return (package_name,
version_constraint)."""
+ if not isinstance(dep_str, str):
+ return "", None
+ # Remove inline comments
+ dep_str = dep_str.split("#")[0].strip()
+ # Match patterns like: apache-airflow-providers-xxx>=1.0.0 or
apache-airflow-providers-xxx>=1.0.0,<2.0
+ match = re.match(r"^(apache-airflow-providers-[a-z0-9-]+)\s*(.*)",
dep_str, re.IGNORECASE)
+ if match:
+ return match.group(1).lower(), match.group(2).strip()
+ return "", None
+
+ old_deps_dict = {}
+ for dep in old_deps:
+ pkg_name, version = parse_dep(dep)
+ if pkg_name:
+ old_deps_dict[pkg_name] = (dep, version)
+
+ for new_dep in new_deps:
+ pkg_name, new_version = parse_dep(new_dep)
+ if not pkg_name:
+ continue
+
+ # Check if this dependency existed before
+ if pkg_name in old_deps_dict:
+ old_dep_str, old_version = old_deps_dict[pkg_name]
+ # Check if the >= condition changed
+ if new_version and old_version and new_version != old_version:
+ # Check if >= version number changed
+ new_ge_match = re.search(r">=\s*([0-9.]+)", new_version)
+ old_ge_match = re.search(r">=\s*([0-9.]+)", old_version)
+
+ if new_ge_match and old_ge_match:
+ new_ge_version = new_ge_match.group(1)
+ old_ge_version = old_ge_match.group(1)
+ if new_ge_version != old_ge_version:
+ violations.append(
+ f"{file_path} [{section}]: {pkg_name} >=
version changed from "
+ f"{old_ge_version} to {new_ge_version}"
+ )
+
+ return violations
diff --git a/dev/breeze/tests/test_selective_checks.py
b/dev/breeze/tests/test_selective_checks.py
index 46e8b61157a..8ab2f6f66cd 100644
--- a/dev/breeze/tests/test_selective_checks.py
+++ b/dev/breeze/tests/test_selective_checks.py
@@ -2779,3 +2779,169 @@ def
test_testable_providers_integrations_excludes_arm_disabled_on_arm():
assert "postgres" in result
assert "trino" not in result
assert "ydb" not in result
+
+
+@patch("airflow_breeze.utils.selective_checks.run_command")
+def test_provider_dependency_bump_check_no_changes(mock_run_command):
+ """Test that provider dependency bump check passes when no pyproject.toml
files are changed."""
+ selective_checks = SelectiveChecks(
+ files=("some_other_file.py",),
+ commit_ref=NEUTRAL_COMMIT,
+ pr_labels=(),
+ github_event=GithubEvents.PULL_REQUEST,
+ default_branch="main",
+ )
+ result = selective_checks.provider_dependency_bump
+ assert result is False
+
+
+@patch("airflow_breeze.utils.selective_checks.run_command")
+def
test_provider_dependency_bump_check_fails_on_provider_version_bump(mock_run_command):
+ """Test that provider dependency bump check fails when provider version is
bumped without label."""
+ old_toml = """
+[project]
+dependencies = [
+ "apache-airflow-providers-common-sql>=1.0.0",
+]
+"""
+ new_toml = """
+[project]
+dependencies = [
+ "apache-airflow-providers-common-sql>=1.1.0",
+]
+"""
+
+ def side_effect(*args, **kwargs):
+ result = Mock()
+ result.returncode = 0
+ if "^:" in args[0][2]:
+ result.stdout = old_toml
+ else:
+ result.stdout = new_toml
+ return result
+
+ mock_run_command.side_effect = side_effect
+
+ with pytest.raises(SystemExit):
+ SelectiveChecks(
+ files=("providers/amazon/pyproject.toml",),
+ commit_ref=NEUTRAL_COMMIT,
+ pr_labels=(),
+ github_event=GithubEvents.PULL_REQUEST,
+ default_branch="main",
+ ).provider_dependency_bump
+
+
+@patch("airflow_breeze.utils.selective_checks.run_command")
+def test_provider_dependency_bump_check_passes_with_label(mock_run_command):
+ """Test that provider dependency bump check passes when label is set."""
+ old_toml = """
+[project]
+dependencies = [
+ "apache-airflow-providers-common-sql>=1.0.0",
+]
+"""
+ new_toml = """
+[project]
+dependencies = [
+ "apache-airflow-providers-common-sql>=1.1.0",
+]
+"""
+
+ def side_effect(*args, **kwargs):
+ result = Mock()
+ result.returncode = 0
+ if "^:" in args[0][2]:
+ result.stdout = old_toml
+ else:
+ result.stdout = new_toml
+ return result
+
+ mock_run_command.side_effect = side_effect
+
+ selective_checks = SelectiveChecks(
+ files=("providers/amazon/pyproject.toml",),
+ commit_ref=NEUTRAL_COMMIT,
+ pr_labels=("allow provider dependency bump",),
+ github_event=GithubEvents.PULL_REQUEST,
+ default_branch="main",
+ )
+ result = selective_checks.provider_dependency_bump
+ assert result is True
+
+
+@patch("airflow_breeze.utils.selective_checks.run_command")
+def
test_provider_dependency_bump_check_passes_on_non_provider_dependency_changes(mock_run_command):
+ """Test that provider dependency bump check passes when non-provider
dependencies change."""
+ old_toml = """
+[project]
+dependencies = [
+ "apache-airflow>=2.10.0",
+ "boto3>=1.37.0",
+]
+"""
+ new_toml = """
+[project]
+dependencies = [
+ "apache-airflow>=2.10.0",
+ "boto3>=1.38.0",
+]
+"""
+
+ def side_effect(*args, **kwargs):
+ result = Mock()
+ result.returncode = 0
+ if "^:" in args[0][2]:
+ result.stdout = old_toml
+ else:
+ result.stdout = new_toml
+ return result
+
+ mock_run_command.side_effect = side_effect
+
+ selective_checks = SelectiveChecks(
+ files=("providers/amazon/pyproject.toml",),
+ commit_ref=NEUTRAL_COMMIT,
+ pr_labels=(),
+ github_event=GithubEvents.PULL_REQUEST,
+ default_branch="main",
+ )
+ result = selective_checks.provider_dependency_bump
+ assert result is False
+
+
+@patch("airflow_breeze.utils.selective_checks.run_command")
+def
test_provider_dependency_bump_check_in_optional_dependencies(mock_run_command):
+ """Test that provider dependency bump check works for
optional-dependencies section."""
+ old_toml = """
+[project.optional-dependencies]
+"cncf.kubernetes" = [
+ "apache-airflow-providers-cncf-kubernetes>=7.0.0",
+]
+"""
+ new_toml = """
+[project.optional-dependencies]
+"cncf.kubernetes" = [
+ "apache-airflow-providers-cncf-kubernetes>=7.2.0",
+]
+"""
+
+ def side_effect(*args, **kwargs):
+ result = Mock()
+ result.returncode = 0
+ if "^:" in args[0][2]:
+ result.stdout = old_toml
+ else:
+ result.stdout = new_toml
+ return result
+
+ mock_run_command.side_effect = side_effect
+
+ with pytest.raises(SystemExit):
+ _ = SelectiveChecks(
+ files=("providers/amazon/pyproject.toml",),
+ commit_ref=NEUTRAL_COMMIT,
+ pr_labels=(),
+ github_event=GithubEvents.PULL_REQUEST,
+ default_branch="main",
+ ).provider_dependency_bump
diff --git a/providers-summary-docs/installing-from-pypi.rst
b/providers-summary-docs/installing-from-pypi.rst
index e7ceb98161c..6dbe6c32338 100644
--- a/providers-summary-docs/installing-from-pypi.rst
+++ b/providers-summary-docs/installing-from-pypi.rst
@@ -24,7 +24,7 @@ PyPI <https://pypi.org/search/?q=apache-airflow-providers>`__.
Installation tools
''''''''''''''''''
-Only ``pip`` installation is currently officially supported.
+Only ``pip`` and ``uv`` installations are officially supported.
.. note::