Dear Airflow Community,

I am thrilled to announce the availability of Apache Airflow 3.0.0b2
for testing! Airflow 3.0 marks a significant milestone as the first major
release in over four years, introducing improvements that enhance user
experience, task execution, and system scalability.

First, a few caveats:

This is a beta release, so do not run it in production. It may contain
significant issues, and you will likely need to reset your database between
this and subsequent beta or release candidate versions. (Consider yourself
warned!)

This release is intended for Airflow developers only to test the build and
start preparing for Airflow 3.0.0. This is not an official release—that
will happen when we create a release candidate and hold a vote. The
expected timeline for the first release candidate is the week of
2025-03-31, but we encourage early feedback to help stabilize the release.

What's new in Airflow 3?

Airflow 3.0.0 introduces significant enhancements and breaking changes.

Notable Features

DAG versioning & Bundles

Airflow now tracks DAG versions, offering better visibility into historical
DAG changes and execution states. The introduction of DAG Bundles ensures
tasks run with the correct code version, even as DAGs evolve.

Modern Web Application

The UI has been rebuilt using React and a complete API-driven structure,
improving maintainability and extensibility. It includes a new
component-based design system and an enhanced information architecture. A
new React-based plugin system supports custom widgets, improved workflow
visibility, and integration with external tools.

Task Execution Interface

Airflow 3.0 adopts a client / server architecture, decoupling task
execution from the internal meta-database via API-based interaction. This
allows for remote execution across networks, multi-language support,
enhanced security, and better dependency management. The Edge Executor
further enables seamless remote task execution without direct database
connections.

Data Assets & Asset-Centric Syntax

Airflow 3.0 enhances dataset management by introducing Data Assets,
expanding beyond tables and files to include ML models and more. Assets can
be explicitly defined using the @asset decorator, simplifying tracking and
dependencies.
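
As a rough illustration of the decorator-based syntax, here is a minimal
sketch; the `airflow.sdk` import path, the `schedule` parameter, and the asset
name are assumptions based on the Airflow 3 development docs and may change
before GA:

```
# Minimal sketch of an @asset-decorated definition (import path and parameters
# are assumptions for this beta and may differ in the final release).
from airflow.sdk import asset


@asset(schedule="@daily")
def daily_order_summary():
    # The decorated callable materializes the asset; returning a value is optional.
    return {"rows_processed": 100}
```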

External Event-Driven Scheduling

Airflow now supports event-driven DAG triggers from external sources like
message queues and blob stores. This builds upon dataset scheduling and
enhances integration with the external data ecosystem.

For a more comprehensive list of new features, please see the 3.0.0b2
release notes:
https://github.com/apache/airflow/blob/3.0.0b2/RELEASE_NOTES.rst#airflow-3-0-0b2-2025-03-06
For a list of the breaking changes, please visit:
https://cwiki.apache.org/confluence/x/9pCMEw

Known limitations in 3.0.0b2:

   - AIP-72 - Task Execution Interface
      - The following capabilities are not yet supported in the beta:
        skip-based tasks (e.g. branch or skip operators) and task callbacks.
   - AIP-38 - Modern Web Application
      - The new UI has limited functionality at this time and is still being
        enhanced until GA. However, feedback on the UX flow is appreciated at
        this time.
      - The underlying FastAPI API server, including the new UI and public
        API, has limited auth + permissions.
      - Notable areas that are usable but not 100% complete: Backfills,
        Connections, Assets, DAG Versioning.



Where to get it?

The beta snapshot is available at:
https://dist.apache.org/repos/dist/dev/airflow/3.0.0b2/

   - apache-airflow-3.0.0b2-bin.tar.gz: Binary Python "sdist" snapshot.
   - apache_airflow-3.0.0b2-py3-none-any.whl: Binary Python wheel snapshot.


This snapshot has not been published to PyPI.

Also present are beta releases for apache-airflow-task-sdk,
apache-airflow-providers-standard, apache-airflow-providers-fab,
apache-airflow-providers-cncf-kubernetes and
apache-airflow-providers-celery, which you will also need for 3.0.0b2 to
work. These have also not been published to PyPI.

Public Keys & Verification

Public keys for verification are available at:
https://www.apache.org/dist/airflow/KEYS
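
As a quick, non-authoritative sketch of how you might verify the artifacts
(assuming the usual Apache .asc signature and .sha512 checksum files are
published alongside the snapshot, and that the checksum file is in a
shasum-compatible format):

```
# Import the Airflow release signing keys
curl -fsSL https://www.apache.org/dist/airflow/KEYS | gpg --import

# Verify the GPG signature and SHA-512 checksum of the wheel (companion
# <artifact>.asc / <artifact>.sha512 filenames are assumed here)
gpg --verify apache_airflow-3.0.0b2-py3-none-any.whl.asc apache_airflow-3.0.0b2-py3-none-any.whl
shasum -a 512 -c apache_airflow-3.0.0b2-py3-none-any.whl.sha512
```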

Instructions:

You can build a virtualenv that installs this beta, and other required
packages (e.g. task sdk), like this:

```
uv venv

uv pip install --find-links https://dist.apache.org/repos/dist/dev/airflow/3.0.0b2/ \
  apache-airflow==3.0.0b2 \
  apache-airflow-task-sdk==1.0.0b2 \
  apache-airflow-providers-celery==3.11.0b2 \
  apache-airflow-providers-cncf-kubernetes==10.4.0b2 \
  apache-airflow-providers-fab==2.0.0b2 \
  apache-airflow-providers-openlineage==2.1.1b2 \
  apache-airflow-providers-standard==1.0.0b2
```

Below are some of the changes you’ll need to consider in order to run this
beta release (a combined example follows this list):

   - The standalone DAG processor is now required, and can be started with
     `airflow dag-processor`.
   - The new UI and public API are started by running `airflow api-server`,
     and the UI is available on port `8080`.
   - Depending on your deployment setup, you may need to set the `[workers]
     execution_api_server_url` config option. This defaults to
     `http://localhost:8080/execution/`.
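
The commands below show one way to bring up the separate components locally
after installing the beta. The component commands and the config option come
from the list above; the `airflow db migrate` step and the environment-variable
form of the config are assumptions about a typical local setup.

```
# Initialize / migrate the metadata database (assumed step for a fresh local setup)
airflow db migrate

# Point workers at the execution API (env-var form of [workers] execution_api_server_url;
# this matches the default noted above)
export AIRFLOW__WORKERS__EXECUTION_API_SERVER_URL="http://localhost:8080/execution/"

# Start the components in separate terminals
airflow api-server      # serves the new UI and public API on port 8080
airflow dag-processor   # standalone DAG processor (now required)
airflow scheduler
```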

Get Involved

We encourage the community to test this release and report any issues or
feedback. Your contributions help us ensure a stable and reliable Airflow
3.0.0 release. Please report issues on GitHub at
https://github.com/apache/airflow/issues and mark that the issue affects
3.0.0. An updated list of all known issues in the beta can also be found
at the above link under the label “affected_version:3.0.0beta”.

A huge thank you to all the contributors who have worked on this milestone
release!

Thanks,
Jed
