potiuk commented on code in PR #28006:
URL: https://github.com/apache/airflow/pull/28006#discussion_r1036339047


##########
airflow/providers/snowflake/hooks/snowflake.py:
##########
@@ -369,15 +369,19 @@ def run(
         with closing(self.get_conn()) as conn:
             self.set_autocommit(conn, autocommit)
 
-            # SnowflakeCursor does not extend ContextManager, so we have to ignore mypy error here
-            with closing(conn.cursor(DictCursor)) as cur:  # type: ignore[type-var]
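The removed lines rely on `contextlib.closing`, which supplies `__enter__`/`__exit__` for objects that only expose a `close()` method (as `SnowflakeCursor` does). A minimal sketch of that pattern, using a hypothetical `FakeCursor` stand-in rather than the real Snowflake cursor:

```python
from contextlib import closing


class FakeCursor:
    """Hypothetical stand-in for a DB-API cursor that lacks
    the context-manager protocol (no __enter__/__exit__)."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


cur = FakeCursor()
# closing() wraps the cursor so `with` works and close() is
# guaranteed to run on exit, even if the body raises.
with closing(cur):
    pass
```

After the `with` block exits, `cur.closed` is `True`.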

Review Comment:
   BTW, this is precisely what I did with Databricks last week: a breaking change in the Databricks provider, with the Hook returning "regular" records but the Operator returning tuples of ("description", "records") - this is how Databricks addressed the same problem. See https://airflow.apache.org/docs/apache-airflow-providers-databricks/stable/index.html#breaking-changes
   
   I will describe it in detail when I prepare the release notes.
   
   It's not as bad as it might seem. People will still be able to downgrade to the 3.* version of the provider for multiple months (and even keep it for as long as they want) - only when they want to use new features of the provider will they have to migrate. I think this is a **really** good deal :). And that is the whole reason we separated providers - so that we **can** make breaking changes when we want to move faster. Once we unify the DBApiHook behaviour (and add protection so that it does not change lightly - that is what all the tests and https://github.com/apache/airflow/pull/27962 are for), all kinds of lineage, common tooling, and the amount of code to maintain for current and future SQL providers will be much, much better.
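   
   The Hook-vs-Operator split above can be sketched roughly as follows. This is an illustrative sketch only - the function names and the fake query result are made up, not the actual provider API - but it shows the shape change: the hook keeps returning plain rows, while the operator layer pairs them with cursor-description-style column metadata.

   ```python
   def hook_run(sql):
       """Hook-level run: returns plain records, as before the change.
       The rows here are a hard-coded stand-in for a real query result."""
       return [(1, "a"), (2, "b")]


   def operator_run(sql):
       """Operator-level run after the breaking change: returns a
       (description, records) tuple, so lineage and common SQL tooling
       can consume column metadata alongside the rows."""
       # Shaped like DB-API cursor.description: one entry per column.
       description = [("id", int), ("value", str)]
       records = hook_run(sql)
       return description, records


   desc, rows = operator_run("SELECT id, value FROM t")
   ```

   Callers of the hook are unaffected; only operator-level consumers need to unpack the extra `description` element.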



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

Reply via email to