Hey all,

I have just cut the new wave of Airflow Providers packages. This email
calls a vote on the release, which will last 72 hours, ending on
Saturday, 13 Aug 2022 at 1pm CEST.

Consider this my (binding) +1.

*Summary of this wave*

Apart from the "regular" bugfix/feature releases, there are a few notable things.

There is a new "major" release of the Amazon provider with some
backwards-incompatible changes: a lot of deprecations have been removed, a
lot of duplication eliminated, and refactorings applied that will make the
Amazon integration easier to extend and test in an automated way when we
get to AIP-47 (big shout-out to Vinc, Dennis and the whole Amazon team
here).

Similarly, the Google provider had a lot of refactorings and changes - many
of them under the hood, preparing for AIP-47 - but also a number of
improvements, fixes and new operators (Wojciech and Lukasz are leading
the pack there).

Also, the Databricks provider got a bunch of new features and fixes (here
Alex is the main contributor).

We have a lot more functionality moved to the relatively new "common.sql"
provider (which is now becoming our "everything-SQL" provider), including a
fix for a bug that made it more difficult to use the new SQL providers in
generic data transfer operators. The "common.sql" provider is key to
getting Lineage information faster into the hands of users of all Airflow
versions. Dmytro and gebo (?) :) are the key players here.

Based on the learnings with Common SQL, we might introduce more "common"
providers in the future.

Connected to that, we have major (slightly backwards-incompatible) releases
of the presto, trino, exasol and hive providers, as we are standardizing
the SQL interface, methods and parameter names, and reusing more of the
"common.sql" functionality across multiple providers. Less duplicated code
and more common functionality is the main theme there.

There are a lot more smaller changes (see the status issue for details).

--------------------------------------------------------------------------

Airflow Providers are available at:
https://dist.apache.org/repos/dist/dev/airflow/providers/

*apache-airflow-providers-<PROVIDER>-*.tar.gz* are the Python "sdist"
releases - they are also the official "sources" for the provider
packages.

*apache_airflow_providers_<PROVIDER>-*.whl are the binary
 Python "wheel" releases.

The test procedure for PMC members who would like to test the RC candidates
is described in
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-by-pmc-members

and for Contributors:

https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-by-contributors
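A minimal way to smoke-test one of the candidates (a sketch only; the
common-sql RC is used here as an arbitrary example from the list below) is
to install it into a throwaway virtualenv:

```shell
# Create an isolated environment so the RC does not touch your main install
python3 -m venv /tmp/provider-rc-test
. /tmp/provider-rc-test/bin/activate

# Install one RC candidate from PyPI (any package from the list below works)
pip install "apache-airflow-providers-common-sql==1.1.0rc1"

# Quick import check of the installed provider package
python -c "import airflow.providers.common.sql; print('import OK')"
```

See the README linked above for the full procedure; running your own DAGs
against the RC is even more valuable feedback.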


Public keys are available at:
https://dist.apache.org/repos/dist/release/airflow/KEYS
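For convenience, the usual ASF verification steps look roughly like this (a
sketch; the artifact file name below is illustrative - substitute whichever
package you downloaded from the dist.apache.org link above):

```shell
# Import the release manager keys (URL from this email)
curl -fsSLO https://dist.apache.org/repos/dist/release/airflow/KEYS
gpg --import KEYS

# Verify the detached signature of an artifact (example file name)
gpg --verify apache-airflow-providers-common-sql-1.1.0rc1.tar.gz.asc \
    apache-airflow-providers-common-sql-1.1.0rc1.tar.gz

# Verify the SHA512 checksum against the published .sha512 file
sha512sum -c apache-airflow-providers-common-sql-1.1.0rc1.tar.gz.sha512
```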

Please vote accordingly:

[ ] +1 approve
[ ] +0 no opinion
[ ] -1 disapprove with the reason


Only votes from PMC members are binding, but members of the community are
encouraged to test the release and vote with "(non-binding)".

Please note that the version number excludes the 'rcX' string.
This will allow us to rename the artifact without modifying
the artifact checksums when we actually release.

The status of testing the providers by the community is kept here:

https://github.com/apache/airflow/issues/25634

You can find packages as well as detailed changelog following the below
links:

https://pypi.org/project/apache-airflow-providers-amazon/5.0.0rc1/
https://pypi.org/project/apache-airflow-providers-apache-drill/2.2.0rc1/
https://pypi.org/project/apache-airflow-providers-apache-druid/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-apache-hdfs/3.1.0rc1/
https://pypi.org/project/apache-airflow-providers-apache-hive/4.0.0rc1/
https://pypi.org/project/apache-airflow-providers-apache-livy/3.1.0rc1/
https://pypi.org/project/apache-airflow-providers-apache-pinot/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-cncf-kubernetes/4.3.0rc1/
https://pypi.org/project/apache-airflow-providers-common-sql/1.1.0rc1/
https://pypi.org/project/apache-airflow-providers-databricks/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-dbt-cloud/2.1.0rc1/
https://pypi.org/project/apache-airflow-providers-elasticsearch/4.2.0rc1/
https://pypi.org/project/apache-airflow-providers-exasol/4.0.0rc1/
https://pypi.org/project/apache-airflow-providers-google/8.3.0rc1/
https://pypi.org/project/apache-airflow-providers-hashicorp/3.1.0rc1/
https://pypi.org/project/apache-airflow-providers-jdbc/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-microsoft-azure/4.2.0rc1/
https://pypi.org/project/apache-airflow-providers-microsoft-mssql/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-mysql/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-neo4j/3.1.0rc1/
https://pypi.org/project/apache-airflow-providers-odbc/3.1.1rc1/
https://pypi.org/project/apache-airflow-providers-oracle/3.3.0rc1/
https://pypi.org/project/apache-airflow-providers-postgres/5.2.0rc1/
https://pypi.org/project/apache-airflow-providers-presto/4.0.0rc1/
https://pypi.org/project/apache-airflow-providers-qubole/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-salesforce/5.1.0rc1/
https://pypi.org/project/apache-airflow-providers-snowflake/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-sqlite/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-trino/4.0.0rc1/
https://pypi.org/project/apache-airflow-providers-vertica/3.2.0rc1/
https://pypi.org/project/apache-airflow-providers-yandex/3.1.0rc1/

Cheers,
J.
