dstandish commented on a change in pull request #11350:
URL: https://github.com/apache/airflow/pull/11350#discussion_r550005187
##########
File path: airflow/providers/snowflake/hooks/snowflake.py
##########
@@ -142,3 +141,18 @@ def set_autocommit(self, conn, autocommit: Any) -> None:
     def get_autocommit(self, conn):
         return getattr(conn, 'autocommit_mode', False)
+
+    def run(self, sql, autocommit=False, parameters=None):
+        """
+        Snowflake-connector doesn't allow natively the execution of multiple SQL statements in the same
+        call. So for allowing to pass files or strings with several queries this method is coded,
+        that relies on run from DBApiHook
+        """
+        if isinstance(sql, str):
+            with closing(self.get_conn()) as conn:
Review comment:
You do not need to wrap `SnowflakeConnection` in `closing`; it already implements a proper `__exit__`, so it can be used as a context manager directly.
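In other words, roughly this (untested sketch, everything else unchanged):
```python
# SnowflakeConnection already implements __enter__/__exit__, so it can be
# used as a context manager directly, without contextlib.closing
with self.get_conn() as conn:
    conn.execute_string(sql, parameters)
```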
##########
File path: airflow/providers/snowflake/hooks/snowflake.py
##########
@@ -142,3 +141,18 @@ def set_autocommit(self, conn, autocommit: Any) -> None:
     def get_autocommit(self, conn):
         return getattr(conn, 'autocommit_mode', False)
+
+    def run(self, sql, autocommit=False, parameters=None):
+        """
+        Snowflake-connector doesn't allow natively the execution of multiple SQL statements in the same
+        call. So for allowing to pass files or strings with several queries this method is coded,
+        that relies on run from DBApiHook
+        """
+        if isinstance(sql, str):
Review comment:
OK, so you want to run with `execute_string` when `sql` is a `str` and fall back to the dbapi `run` otherwise (e.g. when it is `List[str]`). That seems reasonable. Another option would be to use Snowflake's `split_statements` function:
https://github.com/snowflakedb/snowflake-connector-python/blob/master/src/snowflake/connector/util_text.py#L32
You could always split, and then you can always use the dbapi `run`; that would save you the type checking and the autocommit logic. But I think you can reasonably go either way.
One thought though... at my previous company we liked to execute SQL files. What we did was first check whether `sql` was an existing file with `Path(sql).exists()`, and if so do `sql = Path(sql).read_text()`, then execute as normal. You can do this with templating, but I find this easier. Anyway, it is an idea for your consideration if you want to make this a little more generic: `Union[Path, List[str], str]`, where the `str` can be a single statement, multiple statements, or a path to a SQL file.
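For concreteness, a rough sketch of the "always split, then always dbapi run" idea combined with the file check, as a method on the hook (hypothetical and untested against your branch; in particular, double-check what `split_statements` yields in the connector version pinned here, since it may yield tuples rather than plain statement strings):
```python
from io import StringIO
from pathlib import Path

from snowflake.connector.util_text import split_statements


def run(self, sql, autocommit=False, parameters=None):
    # if sql is the path of an existing file, read the file contents
    if isinstance(sql, (str, Path)) and Path(sql).exists():
        sql = Path(sql).read_text()

    # always split a string into individual statements, then delegate to
    # DbApiHook.run (caveat: depending on the connector version,
    # split_statements may yield tuples instead of plain strings)
    if isinstance(sql, str):
        sql = list(split_statements(StringIO(sql)))

    super().run(sql, autocommit=autocommit, parameters=parameters)
```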
##########
File path: airflow/providers/snowflake/hooks/snowflake.py
##########
@@ -142,3 +141,18 @@ def set_autocommit(self, conn, autocommit: Any) -> None:
     def get_autocommit(self, conn):
         return getattr(conn, 'autocommit_mode', False)
+
+    def run(self, sql, autocommit=False, parameters=None):
+        """
+        Snowflake-connector doesn't allow natively the execution of multiple SQL statements in the same
+        call. So for allowing to pass files or strings with several queries this method is coded,
+        that relies on run from DBApiHook
+        """
+        if isinstance(sql, str):
+            with closing(self.get_conn()) as conn:
+                if self.supports_autocommit:
Review comment:
This generic `if self.supports_autocommit` check makes sense in `DbApiHook`, because that is a generic base class meant to support many different databases, some of which support autocommit and some of which don't. In your case here you are implementing for one specific database, and you _know_ whether it supports autocommit, so you don't need to check: just do the right thing.
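Concretely, combined with the `closing` comment above, that branch could look roughly like this (untested sketch; it assumes Snowflake always supports autocommit, which the existing `set_autocommit` above implies):
```python
if isinstance(sql, str):
    # Snowflake supports autocommit, so set it unconditionally and rely on
    # SnowflakeConnection's own context-manager support
    with self.get_conn() as conn:
        self.set_autocommit(conn, autocommit)
        conn.execute_string(sql, parameters)
else:
    super().run(sql, autocommit, parameters)
```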
##########
File path: airflow/providers/snowflake/hooks/snowflake.py
##########
@@ -142,3 +141,18 @@ def set_autocommit(self, conn, autocommit: Any) -> None:
     def get_autocommit(self, conn):
         return getattr(conn, 'autocommit_mode', False)
+
+    def run(self, sql, autocommit=False, parameters=None):
+        """
+        Snowflake-connector doesn't allow natively the execution of multiple SQL statements in the same
+        call. So for allowing to pass files or strings with several queries this method is coded,
+        that relies on run from DBApiHook
+        """
+        if isinstance(sql, str):
+            with closing(self.get_conn()) as conn:
+                if self.supports_autocommit:
+                    self.set_autocommit(conn, autocommit)
+
+                conn.execute_string(sql, parameters)
+        else:
+            super().run(sql, autocommit, parameters)
Review comment:
@JavierLopezT what you want to do is use `mock.patch` to verify that, inside the `run` method, the right underlying method is called for a given input. Google for basic examples of mock patching.
In this case you would mock the `get_conn` method so that its `return_value` is a mock object. Then, after `run` finishes, you can verify that `execute_string` was called on that mock object with your value for `sql`.
There are lots of examples in the repo, so just try to find something similar.
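Something along these lines, for example (a hypothetical sketch, not matched to the existing test module; the test name and the no-argument `SnowflakeHook()` construction are my assumptions):
```python
from unittest import mock

from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook


@mock.patch.object(SnowflakeHook, 'get_conn')
def test_run_with_string_uses_execute_string(mock_get_conn):
    hook = SnowflakeHook()
    sql = 'select 1; select 2;'

    hook.run(sql)

    # run() should have handed the raw string to the mocked connection's
    # execute_string (with the default parameters=None)
    mock_get_conn.return_value.execute_string.assert_called_once_with(sql, None)
```
If you later switch to `with self.get_conn() as conn:` without `closing`, note that the object executed on becomes `get_conn().__enter__()`, so the assertion target would need adjusting.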
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]