Remmertmuller opened a new issue, #30236: URL: https://github.com/apache/airflow/issues/30236
### Apache Airflow Provider(s)

snowflake

### Versions of Apache Airflow Providers

apache-airflow-providers-amazon==7.1.0
apache-airflow-providers-celery==3.1.0
apache-airflow-providers-cncf-kubernetes==5.1.1
apache-airflow-providers-common-sql==1.3.3
apache-airflow-providers-docker==3.4.0
apache-airflow-providers-elasticsearch==4.3.3
apache-airflow-providers-ftp==3.3.0
apache-airflow-providers-google==8.8.0
apache-airflow-providers-grpc==3.1.0
apache-airflow-providers-hashicorp==3.2.0
apache-airflow-providers-http==4.1.1
apache-airflow-providers-imap==3.1.1
apache-airflow-providers-jdbc==3.3.0
apache-airflow-providers-microsoft-azure==5.1.0
apache-airflow-providers-microsoft-mssql==3.3.2
apache-airflow-providers-mysql==4.0.0
apache-airflow-providers-odbc==3.2.1
apache-airflow-providers-oracle==3.6.0
apache-airflow-providers-postgres==5.4.0
apache-airflow-providers-redis==3.1.0
apache-airflow-providers-sendgrid==3.1.0
apache-airflow-providers-sftp==4.2.1
apache-airflow-providers-slack==7.2.0
apache-airflow-providers-snowflake==4.0.2
apache-airflow-providers-sqlite==3.3.1
apache-airflow-providers-ssh==3.4.0

### Apache Airflow version

2.5.1

### Operating System

Debian GNU/Linux 11 (bullseye)

### Deployment

Other Docker-based deployment

### Deployment details

_No response_

### What happened

We use the SnowflakeHook to execute stored procedures and then fetch the result for further logic. On the hook we call the `get_first()` method from the DbApiHook. When upgrading the Snowflake provider to version 4.0.2 we got the same errors as described in [issue #30010](https://github.com/apache/airflow/issues/30010). It is possible to configure `session_parameters` on the connection or on the hook, but they are either not set when the connection is made or they are overwritten when the `run()` method on the SnowflakeHook is executed.

### What you think should happen instead

I would like the `session_parameters` (in my case AUTOCOMMIT) to be applied when the connection is created or the hook is instantiated. And when AUTOCOMMIT is set in the session parameters, the `run()` method on the hook should not reset it to the default value `False`.

### How to reproduce

First I tested putting an AUTOCOMMIT session parameter on the connection. I created a Snowflake connection with `{"session_parameters": {"AUTOCOMMIT": "True"}}` in the extras field. Then, when I run the following task:

```
@task(task_id="snowflake_test_autocommit")
def snowflake_test_autocommit():
    hook = SnowflakeHook("snowflake_autocommit")
    return hook.get_first(sql="SELECT '*** statement successfully executed'")
```

I see in the logs that autocommit is set to False:

```
[2023-03-22, 10:35:36 UTC] {{cursor.py:727}} INFO - query: [ALTER SESSION SET autocommit=False]
```

The same happens when setting the session parameter while instantiating the hook:

```
@task(task_id="snowflake_test_autocommit_on_hook")
def snowflake_test_autocommit_on_hook():
    hook = SnowflakeHook("snowflake", session_parameters={"AUTOCOMMIT": True})
    return hook.get_first(sql="SELECT '*** statement successfully executed'")
```

### Anything else

_No response_

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
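A possible workaround while the provider behaves this way (a sketch only, not verified against provider 4.0.2; the task id is made up and the connection id is the one from the reproduction steps) is to pass `autocommit=True` explicitly to the hook's `run()` method, which `DbApiHook.run()` accepts alongside a result `handler`, instead of relying on `session_parameters`:

```
from airflow.decorators import task
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook


@task(task_id="snowflake_test_autocommit_workaround")
def snowflake_test_autocommit_workaround():
    # Same connection id as in the reproduction steps above.
    hook = SnowflakeHook("snowflake_autocommit")

    # run() defaults to autocommit=False and resets the session accordingly;
    # passing autocommit=True explicitly sidesteps the session_parameters issue.
    # The handler receives the cursor, so fetchone() mirrors get_first().
    return hook.run(
        sql="SELECT '*** statement successfully executed'",
        autocommit=True,
        handler=lambda cursor: cursor.fetchone(),
    )
```

This keeps the autocommit setting under the hook's explicit control rather than depending on the connection extras being honoured.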
