I'm happy to announce that new versions of Airflow Providers packages
were just released.

These are released mostly to rectify the problem of accidentally
adding gitpython and wheel as dependencies for all providers (but there
are also a few bugfixes - notably cncf.kubernetes and elasticsearch
fixes for Airflow 2.1 compatibility).

Detailed list of the changes, excluding the gitpython/wheel dependency fix:

Provider amazon: 3.2.0
* Add more filter to s3 hook list_key (#22231)
* ImapAttachmentToS3Operator: fix it, update sample dag and update doc (#22351)

Provider apache.beam: 3.3.0
* Add recipe for BeamRunGoPipelineOperator (#22296): @pierrejeambrun

Provider cncf.kubernetes: 3.1.2
* Fix "run_id" k8s and elasticsearch compatibility with Airflow 2.1 (#22385)
* Remove RefreshConfiguration workaround for K8s token refreshing (#20759)

Provider databricks: 2.5.0
* Operator for updating Databricks Repos (#22278)

Provider docker: 2.5.2
* Correct multiple_outputs param descriptions mentioning lists/tuple (#22371)

Provider elasticsearch: 3.0.2
* Fix "run_id" k8s and elasticsearch compatibility with Airflow 2.1 (#22385)

Provider google: 6.7.0
* Add dataflow_default_options to templated_fields (#22367)
* Add LocalFilesystemToGoogleDriveOperator (#22219)
* Add timeout and retry to the BigQueryInsertJobOperator (#22395)
* Fix #21989 indentation. A test is added to confirm job is executed on… (#22302)
* [FIX] typo doc of gcs operator (#22290)

Provider postgres: 4.1.0
* Add ability to pass config parameters to postgres operator (#21551)

Provider snowflake: 2.6.0
* Add support for private key in connection for Snowflake (#22266)

You can install the providers from PyPI:
https://airflow.apache.org/docs/apache-airflow-providers/installing-from-pypi
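
For example, a plain pip upgrade of the bugfix providers mentioned above
might look like this (package names follow the standard
apache-airflow-providers-<id> pattern; the version pins match the
versions listed in this announcement):

```shell
# Upgrade the providers carrying the Airflow 2.1 compatibility fixes,
# pinned to the versions announced above.
pip install --upgrade \
  "apache-airflow-providers-cncf-kubernetes==3.1.2" \
  "apache-airflow-providers-elasticsearch==3.0.2"
```

The same pattern applies to any of the other providers listed above, e.g.
"apache-airflow-providers-amazon==3.2.0".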

The documentation is available at https://airflow.apache.org/docs/ and
linked from the PyPI packages.

Cheers,
J.
