potiuk opened a new issue #16456: URL: https://github.com/apache/airflow/issues/16456
I have a kind request for all the contributors to the latest provider packages release. Could you please help us test the RC versions of the providers and let us know in a comment whether the issues are addressed there?

## Providers that need testing

These are providers that require testing, as some substantial changes were introduced:

### Provider [airbyte: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-airbyte/2.0.0rc1)
- [ ] [Add test_connection method to Airbyte hook (#16236)](https://github.com/apache/airflow/pull/16236): @msumit
- [ ] [Fix hooks extended from http hook (#16109)](https://github.com/apache/airflow/pull/16109): @msumit

### Provider [amazon: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-amazon/2.0.0rc1)
- [ ] [read timestamp from Cloudwatch events (#15173)](https://github.com/apache/airflow/pull/15173): @codenamestif
- [ ] [remove retry for now (#16150)](https://github.com/apache/airflow/pull/16150): @zachliu
- [ ] [Remove the `not-allow-trailing-slash` rule on S3_hook (#15609)](https://github.com/apache/airflow/pull/15609): @Isaacwhyuenac
- [ ] [Add support of capacity provider strategy for ECSOperator (#15848)](https://github.com/apache/airflow/pull/15848): @codenamestif
- [ ] [Update copy command for s3 to redshift (#16241)](https://github.com/apache/airflow/pull/16241): @sunki-hong
- [ ] [Fix S3 Select payload join (#16189)](https://github.com/apache/airflow/pull/16189): @TAKEDA-Takashi
- [ ] [Fix spacing in AwsBatchWaitersHook docstring (#15839)](https://github.com/apache/airflow/pull/15839): @jlaneve
- [ ] [MongoToS3Operator failed when running with a single query (not aggregate pipeline) (#15680)](https://github.com/apache/airflow/pull/15680): @amatellanes
- [ ] [fix: AwsGlueJobOperator change order of args for load_file (#16216)](https://github.com/apache/airflow/pull/16216): @avocadomaster

### Provider [apache.spark: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-spark/2.0.0rc1)
- [ ] [Make SparkSqlHook use Connection (#15794)](https://github.com/apache/airflow/pull/15794): @uranusjr

### Provider [cncf.kubernetes: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-cncf-kubernetes/2.0.0rc1)
- [ ] [Add KPO pod-template-file jinja template support. (#15942)](https://github.com/apache/airflow/pull/15942): @Dr-Denzy
- [ ] [Save pod name to xcom for KubernetesPodOperator (#15755)](https://github.com/apache/airflow/pull/15755): @Junnplus
- [ ] [Bug Fix Pod-Template Affinity Ignored due to empty Affinity K8S Object (#15787)](https://github.com/apache/airflow/pull/15787): @jpyen
- [ ] [Bug Pod Template File Values Ignored (#16095)](https://github.com/apache/airflow/pull/16095): @jpyen
- [ ] [Fix issue with parsing error logs in the KPO (#15638)](https://github.com/apache/airflow/pull/15638): @dimberman

### Provider [dingding: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-dingding/2.0.0rc1)
- [ ] [Fix hooks extended from http hook (#16109)](https://github.com/apache/airflow/pull/16109): @msumit

### Provider [discord: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-discord/2.0.0rc1)
- [ ] [Fix hooks extended from http hook (#16109)](https://github.com/apache/airflow/pull/16109): @msumit

### Provider [docker: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-docker/2.0.0rc1)
- [ ] [Replace DockerOperator's 'volumes' arg for 'mounts' (#15843)](https://github.com/apache/airflow/pull/15843): @uranusjr

### Provider [elasticsearch: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-elasticsearch/2.0.0rc1)
- [ ] [Support remote logging in elasticsearch with filebeat 7 (#14625)](https://github.com/apache/airflow/pull/14625): @jedcunningham

### Provider [google: 4.0.0rc1](https://pypi.org/project/apache-airflow-providers-google/4.0.0rc1)
- [ ] [Move plyvel to google provider extra (#15812)](https://github.com/apache/airflow/pull/15812): @dstandish
- [ ] [Fixes AzureFileShare connection extras (#16388)](https://github.com/apache/airflow/pull/16388): @potiuk
- [ ] [Add extra links for google dataproc (#10343)](https://github.com/apache/airflow/pull/10343): @yesemsanthoshkumar
- [ ] [Add link to Oracle Connection Docs (#15632)](https://github.com/apache/airflow/pull/15632): @sunkickr
- [ ] [pass wait_for_done parameter down to _DataflowJobsController (#15541)](https://github.com/apache/airflow/pull/15541): @dejii
- [ ] [Update Google Ads hook (#15266)](https://github.com/apache/airflow/pull/15266): @jacobhjkim
- [ ] [Implement BigQuery Table Schema Update Operator (#15367)](https://github.com/apache/airflow/pull/15367): @thejens
- [ ] [Add BigQueryToMsSqlOperator (#15422)](https://github.com/apache/airflow/pull/15422): @subkanthi
- [ ] [Fix: GCS To BigQuery source_object (#16160)](https://github.com/apache/airflow/pull/16160): @tegardp
- [ ] [FIX: unnecessary downloads in GCSToLocalFilesystemOperator (#16171)](https://github.com/apache/airflow/pull/16171): @p-kachalov
- [ ] [Fix bigquery type error when export format is parquet (#16027)](https://github.com/apache/airflow/pull/16027): @sunki-hong
- [ ] [Fix argument ordering and type of bucket and object (#15738)](https://github.com/apache/airflow/pull/15738): @sjvanrossum
- [ ] [Fix sql_to_gcs docstring lint error (#15730)](https://github.com/apache/airflow/pull/15730): @natanweinberger
- [ ] [Ensure `mysql_to_gcs` fully compatible with MySQL and BigQuery for `datetime`-related values (#15026)](https://github.com/apache/airflow/pull/15026): @tianjianjiang
- [ ] [Fix deprecation warnings location in google provider (#16403)](https://github.com/apache/airflow/pull/16403): @ashb

### Provider [hashicorp: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-hashicorp/2.0.0rc1)
- [ ] [Sanitize end of line character when loading token from a file (vault) (#16407)](https://github.com/apache/airflow/pull/16407): @mmenarguezpear

### Provider [http: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-http/2.0.0rc1)
- [ ] [Update SimpleHttpOperator to take auth type object (#15605)](https://github.com/apache/airflow/pull/15605): @fredthomsen
- [ ] [HttpHook. Use request factory and respect defaults (#14701)](https://github.com/apache/airflow/pull/14701): @ngaranko

### Provider [microsoft.azure: 3.0.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-azure/3.0.0rc1)
- [ ] [Fixes AzureFileShare connection extras (#16388)](https://github.com/apache/airflow/pull/16388): @potiuk
- [ ] [Add link to Oracle Connection Docs (#15632)](https://github.com/apache/airflow/pull/15632): @sunkickr
- [ ] [Fix WasbHook.delete_file broken when using prefix (#15637)](https://github.com/apache/airflow/pull/15637): @monti-python
- [ ] [Fix colon spacing in AzureDataExplorerHook docstring (#15841)](https://github.com/apache/airflow/pull/15841): @jlaneve
- [ ] [fix wasb remote logging when blob already exists (#16280)](https://github.com/apache/airflow/pull/16280): @flolas

### Provider [odbc: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-odbc/2.0.0rc1)
- [ ] [OdbcHook returns None. Related to #15016 issue. (#15510)](https://github.com/apache/airflow/pull/15510): @Goodkat
- [ ] [Fix OdbcHook handling of port (#15772)](https://github.com/apache/airflow/pull/15772): @dstandish

### Provider [opsgenie: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-opsgenie/2.0.0rc1)
- [ ] [Fix hooks extended from http hook (#16109)](https://github.com/apache/airflow/pull/16109): @msumit

### Provider [oracle: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-oracle/2.0.0rc1)
- [ ] [Add optional result handler to database hooks (#15581)](https://github.com/apache/airflow/pull/15581): @malthe
- [ ] [[Oracle] Add port to DSN (#15589)](https://github.com/apache/airflow/pull/15589): @malthe
- [ ] [Add link to Oracle Connection Docs (#15632)](https://github.com/apache/airflow/pull/15632): @sunkickr

### Provider [papermill: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-papermill/2.0.0rc1)
- [ ] [Emit error on duplicated DAG ID (#15302)](https://github.com/apache/airflow/pull/15302): @uranusjr

### Provider [plexus: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-plexus/2.0.0rc1)
- [ ] [Removes arrow higher limits for plexus provider (#16026)](https://github.com/apache/airflow/pull/16026): @potiuk

### Provider [postgres: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-postgres/2.0.0rc1)
- [ ] [PostgresHook: deepcopy connection to avoid mutating connection obj (#15412)](https://github.com/apache/airflow/pull/15412): @zhzhang
- [ ] [Avoid passing `aws_conn_id` as conn_args for `psycopg2.connect` (#16100)](https://github.com/apache/airflow/pull/16100): @gabrielsyapse

### Provider [qubole: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-qubole/2.0.0rc1)
- [ ] [Qubole Hook Does Not Support 'include_headers' (#15598)](https://github.com/apache/airflow/issues/15598): @levyitay
- [ ] [Feature qubole hook support headers (#15683)](https://github.com/apache/airflow/pull/15683): @levyitay
- [ ] [Feature qubole hook support headers (#15615)](https://github.com/apache/airflow/pull/15615): @levyitay

### Provider [samba: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-samba/2.0.0rc1)
- [ ] [Add support for extra parameters to samba client (#16115)](https://github.com/apache/airflow/pull/16115): @malthe

### Provider [sftp: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-sftp/2.0.0rc1)
- [ ] [Depreciate private_key_pass extra param and rename to private_key_passphrase (#14028)](https://github.com/apache/airflow/pull/14028): @pgillet

### Provider [slack: 4.0.0rc1](https://pypi.org/project/apache-airflow-providers-slack/4.0.0rc1)
- [ ] [Fix hooks extended from http hook (#16109)](https://github.com/apache/airflow/pull/16109): @msumit

### Provider [snowflake: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-snowflake/2.0.0rc1)
- [ ] [Add `template_fields` to `S3ToSnowflake` operator (#15926)](https://github.com/apache/airflow/pull/15926): @nlecoy
- [ ] [Allow S3ToSnowflakeOperator to omit schema (#15817)](https://github.com/apache/airflow/pull/15817): @uranusjr
- [ ] [Added ability for Snowflake to attribute usage to Airflow by adding an application parameter (#16420)](https://github.com/apache/airflow/pull/16420): @sfc-gh-madkins
- [ ] [fix: restore parameters support when sql passed to SnowflakeHook as str (#16102)](https://github.com/apache/airflow/pull/16102): @grassten

### Provider [ssh: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-ssh/2.0.0rc1)
- [ ] [Fixed #9963: Display explicit error in case UID has no actual username (#15212)](https://github.com/apache/airflow/pull/15212): @andrewgodwin

## Providers that do not need testing

These are providers whose changes were either doc-only or do not require testing.
* Provider [apache.beam: 3.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-beam/3.0.0rc1)
* Provider [apache.cassandra: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-cassandra/2.0.0rc1)
* Provider [apache.druid: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-druid/2.0.0rc1)
* Provider [apache.hdfs: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-hdfs/2.0.0rc1)
* Provider [apache.hive: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-hive/2.0.0rc1)
* Provider [apache.kylin: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-kylin/2.0.0rc1)
* Provider [apache.livy: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-livy/2.0.0rc1)
* Provider [apache.pig: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-pig/2.0.0rc1)
* Provider [apache.pinot: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-pinot/2.0.0rc1)
* Provider [apache.sqoop: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-apache-sqoop/2.0.0rc1)
* Provider [asana: 1.0.0rc1](https://pypi.org/project/apache-airflow-providers-asana/1.0.0rc1)
* Provider [celery: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-celery/2.0.0rc1)
* Provider [cloudant: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-cloudant/2.0.0rc1)
* Provider [databricks: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-databricks/2.0.0rc1)
* Provider [datadog: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-datadog/2.0.0rc1)
* Provider [exasol: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-exasol/2.0.0rc1)
* Provider [facebook: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-facebook/2.0.0rc1)
* Provider [ftp: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-ftp/2.0.0rc1)
* Provider [grpc: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-grpc/2.0.0rc1)
* Provider [imap: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-imap/2.0.0rc1)
* Provider [jdbc: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-jdbc/2.0.0rc1)
* Provider [jenkins: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-jenkins/2.0.0rc1)
* Provider [jira: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-jira/2.0.0rc1)
* Provider [microsoft.mssql: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-mssql/2.0.0rc1)
* Provider [microsoft.winrm: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-winrm/2.0.0rc1)
* Provider [mongo: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-mongo/2.0.0rc1)
* Provider [mysql: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-mysql/2.0.0rc1)
* Provider [neo4j: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-neo4j/2.0.0rc1)
* Provider [openfaas: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-openfaas/2.0.0rc1)
* Provider [pagerduty: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-pagerduty/2.0.0rc1)
* Provider [presto: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-presto/2.0.0rc1)
* Provider [redis: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-redis/2.0.0rc1)
* Provider [salesforce: 3.0.0rc1](https://pypi.org/project/apache-airflow-providers-salesforce/3.0.0rc1)
* Provider [segment: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-segment/2.0.0rc1)
* Provider [sendgrid: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-sendgrid/2.0.0rc1)
* Provider [singularity: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-singularity/2.0.0rc1)
* Provider [sqlite: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-sqlite/2.0.0rc1)
* Provider [tableau: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-tableau/2.0.0rc1)
* Provider [telegram: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-telegram/2.0.0rc1)
* Provider [trino: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-trino/2.0.0rc1)
* Provider [vertica: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-vertica/2.0.0rc1)
* Provider [yandex: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-yandex/2.0.0rc1)
* Provider [zendesk: 2.0.0rc1](https://pypi.org/project/apache-airflow-providers-zendesk/2.0.0rc1)

<!--
NOTE TO RELEASE MANAGER: Please move here the providers that have doc-only changes or for which the changes are trivial and you can assess that they are OK.

The providers are automatically installed on Airflow 2.1 and the latest `main` during CI, so we know they are installable. All classes within the providers are also imported during the CI run, so we know all providers can be imported.
-->

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
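For contributors picking up one of the providers above: after installing the RC with pip (e.g. `pip install "apache-airflow-providers-airbyte==2.0.0rc1"`), a quick way to confirm the RC build is actually the one in your environment is to query the package metadata. The helper below is a minimal sketch, not part of the release process; the `installed_version` name and the airbyte example are purely illustrative.

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(dist_name: str):
    """Return the installed version of a distribution, or None if it is absent."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None


# Example: confirm the RC build of the provider under test is installed
# (the airbyte provider is used here purely for illustration).
found = installed_version("apache-airflow-providers-airbyte")
if found != "2.0.0rc1":
    print(f"Expected 2.0.0rc1, found: {found}")
```

Note that pip will not pick pre-releases by default, so the `==2.0.0rc1` pin (or `--pre`) is required when installing the RC.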