chinwobble commented on a change in pull request #19736:
URL: https://github.com/apache/airflow/pull/19736#discussion_r753877719
##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -493,3 +504,120 @@ def __init__(self, token: str) -> None:
def __call__(self, r: PreparedRequest) -> PreparedRequest:
r.headers['Authorization'] = 'Bearer ' + self.token
return r
+
+
+class DatabricksAsyncHook(DatabricksHook):
+ """
+ Async version of the ``DatabricksHook``.
+ Implements only the methods needed by the Databricks triggers.
+ """
+
+ def __init__(self, *args: Any, **kwargs: Any) -> None:
+ super().__init__(*args, **kwargs)
+
+ async def __aenter__(self):
+ self._session = aiohttp.ClientSession()
Review comment:
This is really good stuff.
Currently you are using one Hook per trigger, which means that with 32
concurrent triggers you would end up with 32 separate client sessions.
I've had a look at the aiohttp docs, which say:
```
it is suggested you use a single session for the lifetime of your
application to benefit from connection pooling.
```
https://docs.aiohttp.org/en/stable/client_reference.html
What do you think?
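
One way to follow that advice would be to lazily create a single session at the class level and hand the same instance to every trigger. The sketch below is only illustrative (the mixin name, `get_shared_session`, and the stand-in `object()` session are hypothetical, not the Airflow or Databricks API); in the real hook the shared resource would be an `aiohttp.ClientSession()`:

```python
import asyncio
from typing import Optional


class SharedSessionMixin:
    """Hypothetical sketch: all instances share one client session
    instead of each trigger opening its own."""

    _session = None                                 # class-level, shared by every instance
    _session_lock: Optional[asyncio.Lock] = None    # guards lazy creation

    @classmethod
    async def get_shared_session(cls):
        # Create the lock lazily so it is bound to the running event loop.
        if cls._session_lock is None:
            cls._session_lock = asyncio.Lock()
        async with cls._session_lock:
            if cls._session is None:
                # In the real hook this would be aiohttp.ClientSession();
                # a plain object() stands in so the sketch is self-contained.
                cls._session = object()
        return cls._session


async def main():
    # Two concurrent "triggers" asking for a session get the same one.
    a, b = await asyncio.gather(
        SharedSessionMixin.get_shared_session(),
        SharedSessionMixin.get_shared_session(),
    )
    return a is b


print(asyncio.run(main()))  # → True
```

The lock is only needed to protect the first creation; after that every caller gets the cached instance, so all triggers benefit from one connection pool. Closing the session would then belong to application shutdown rather than to any individual trigger's `__aexit__`.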
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]