potiuk opened a new issue #22264:
URL: https://github.com/apache/airflow/issues/22264


   ### Body
   
   ## Provider [alibaba: 
1.1.0rc1](https://pypi.org/project/apache-airflow-providers-alibaba/1.1.0rc1)
      - [ ] [Add oss_task_handler into alibaba-provider and enable remote 
logging to OSS (#21785)](https://github.com/apache/airflow/pull/21785): 
@EricGao888
   ## Provider [amazon: 
3.1.1rc1](https://pypi.org/project/apache-airflow-providers-amazon/3.1.1rc1)
      - [ ] [Added AWS RDS sensors 
(#21231)](https://github.com/apache/airflow/pull/21231): @kazanzhy
      - [ ] [Added AWS RDS operators 
(#20907)](https://github.com/apache/airflow/pull/20907): @kazanzhy
      - [ ] [Add RedshiftDataHook 
(#19137)](https://github.com/apache/airflow/pull/19137): @john-jac
      - [ ] [Feature: Add invoke lambda function operator 
(#21686)](https://github.com/apache/airflow/pull/21686): @schirag1993
      - [ ] [Add JSON output on SqlToS3Operator 
(#21779)](https://github.com/apache/airflow/pull/21779): @mariotaddeucci
      - [ ] [Implement a Sagemaker DeleteModelOperator and Delete model hook. 
(#21673)](https://github.com/apache/airflow/pull/21673): @hsrocks
      - [ ] [Added Hook for Amazon RDS 
(#20642)](https://github.com/apache/airflow/pull/20642): @kazanzhy
      - [ ] [Added SNS example DAG and rst 
(#21475)](https://github.com/apache/airflow/pull/21475): @ferruzzi
      - [ ] [retry on very specific eni provision failures 
(#22002)](https://github.com/apache/airflow/pull/22002): @zachliu
      - [ ] [Configurable AWS Session Factory 
(#21778)](https://github.com/apache/airflow/pull/21778): @ac1997
      - [ ] [S3KeySensor to use S3Hook url parser 
(#21500)](https://github.com/apache/airflow/pull/21500): @dstandish
      - [ ] [ECSOperator: Get log events after sleep to get all logs 
(#21574)](https://github.com/apache/airflow/pull/21574): @kanga333
      - [ ] [Use temporary file in GCSToS3Operator 
(#21295)](https://github.com/apache/airflow/pull/21295): @rafalh
      - [ ] [AWS RDS integration fixes 
(#22125)](https://github.com/apache/airflow/pull/22125): @kazanzhy
      - [ ] [Fix the Type Hints in ``RedshiftSQLOperator`` 
(#21885)](https://github.com/apache/airflow/pull/21885): @kaxil
      - [ ] [Bug Fix - S3DeleteObjectsOperator will try and delete all keys 
(#21458)](https://github.com/apache/airflow/pull/21458): @njrs92
      - [ ] [Fix Amazon SES emailer signature 
(#21681)](https://github.com/apache/airflow/pull/21681): @potiuk
      - [ ] [Fix EcsOperatorError, so it can be loaded from a picklefile 
(#21441)](https://github.com/apache/airflow/pull/21441): @ngwallace
      - [ ] [Fix RedshiftDataOperator and update doc 
(#22157)](https://github.com/apache/airflow/pull/22157): @vincbeck
      - [ ] [bugfix (#22137)](https://github.com/apache/airflow/pull/22137): 
@zachliu
      - [ ] [If uploading task logs to S3 fails, retry once 
(#21981)](https://github.com/apache/airflow/pull/21981): @steveyz-astro
      - [ ] [Bug-fix GCSToS3Operator 
(#22071)](https://github.com/apache/airflow/pull/22071): @rsg17
      - [ ] [Refactor query status polling logic in EMRContainerHook 
(#21423)](https://github.com/apache/airflow/pull/21423): @victorphoenix3
      - [ ] [use different logger to avoid duplicate log entry 
(#22256)](https://github.com/apache/airflow/pull/22256): @zachliu
      - [ ] [[doc] Improve s3 operator example by adding task upload_keys 
(#21422)](https://github.com/apache/airflow/pull/21422): @zhongjiajie
      - [ ] [Rename 'S3' hook name to 'Amazon S3' 
(#21988)](https://github.com/apache/airflow/pull/21988): @TreyYi
      - [ ] [Add template fields to DynamoDBToS3Operator 
(#22080)](https://github.com/apache/airflow/pull/22080): @eladkal
   ## Provider [databricks: 
2.4.0rc1](https://pypi.org/project/apache-airflow-providers-databricks/2.4.0rc1)
      - [ ] [Add new options to DatabricksCopyIntoOperator 
(#22076)](https://github.com/apache/airflow/pull/22076): @alexott
      - [ ] [Databricks hook - retry on HTTP Status 429 as well 
(#21852)](https://github.com/apache/airflow/pull/21852): @alexott
      - [ ] [Skip some tests for Databricks from running on Python 3.10 
(#22221)](https://github.com/apache/airflow/pull/22221): @potiuk
   ## Provider [docker: 
2.5.1rc1](https://pypi.org/project/apache-airflow-providers-docker/2.5.1rc1)
      - [ ] [Avoid trying to kill container when it did not succeed for Docker 
(#22145)](https://github.com/apache/airflow/pull/22145): @potiuk
   ## Provider [google: 
6.6.0rc1](https://pypi.org/project/apache-airflow-providers-google/6.6.0rc1)
      - [ ] [Support Uploading Bigger Files to Google Drive 
(#22179)](https://github.com/apache/airflow/pull/22179): @ulsc
      - [ ] [Make `chunk_size` in `GoogleDriveHook` standard 
(#22222)](https://github.com/apache/airflow/pull/22222): @ulsc
      - [ ] [Add guide for DataprocInstantiateInlineWorkflowTemplateOperator 
(#22062)](https://github.com/apache/airflow/pull/22062): @NiloFreitas
      - [ ] [Allow GCS Metadata to be included in GCS Upload 
(#22058)](https://github.com/apache/airflow/pull/22058): @patricker
      - [ ] [Dataplex operators 
(#20377)](https://github.com/apache/airflow/pull/20377): @wojsamjan
      - [ ] [Add support for ARM platform 
(#22127)](https://github.com/apache/airflow/pull/22127): @potiuk
      - [ ] [Use yaml safe load 
(#22091)](https://github.com/apache/airflow/pull/22091): @lwyszomi
   ## Provider [snowflake: 
2.5.2rc1](https://pypi.org/project/apache-airflow-providers-snowflake/2.5.2rc1)
      - [ ] [Remove Snowflake limits 
(#22181)](https://github.com/apache/airflow/pull/22181): @potiuk
   
   The guidelines on how to test providers can be found in
   
   [Verify providers by 
contributors](https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-by-contributors)
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.

