andrewgodwin commented on a change in pull request #19736:
URL: https://github.com/apache/airflow/pull/19736#discussion_r758576866
##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -493,3 +504,120 @@ def __init__(self, token: str) -> None:
def __call__(self, r: PreparedRequest) -> PreparedRequest:
r.headers['Authorization'] = 'Bearer ' + self.token
return r
+
+
+class DatabricksAsyncHook(DatabricksHook):
+    """
+    Async version of the ``DatabricksHook``.
+    Implements only the methods needed by the Databricks triggers.
+    """
+
+ def __init__(self, *args: Any, **kwargs: Any) -> None:
+ super().__init__(*args, **kwargs)
+
+ async def __aenter__(self):
+ self._session = aiohttp.ClientSession()
Review comment:
Yes, there is no easy mechanism right now for pooling connections across
everything running under a single triggerer, but an async hook implementation
can also auto-pool on its own by detecting the same-thread, different-coroutine
case if it needs to. It just makes the hook quite a bit more complex.
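
To illustrate the idea in the comment, here is a minimal, hypothetical sketch of per-event-loop pooling: coroutines running on the same loop (i.e. the same triggerer thread) reuse one shared session object instead of each opening their own. The names (`get_shared_session`, the `factory` parameter) are invented for this sketch, and a plain `object()` stands in for an `aiohttp.ClientSession` so the example stays self-contained:

```python
import asyncio
from typing import Any, Callable, Dict

# One shared session per event loop. Coroutines on the same loop (same
# triggerer thread) get the same object; a different loop gets its own.
_sessions: Dict[asyncio.AbstractEventLoop, Any] = {}


def get_shared_session(factory: Callable[[], Any]) -> Any:
    """Return the pooled session for the current event loop, creating it once."""
    loop = asyncio.get_running_loop()
    if loop not in _sessions:
        _sessions[loop] = factory()
    return _sessions[loop]


async def worker(results: list) -> None:
    # Stand-in for a hook method; in a real hook the factory would be
    # something like ``aiohttp.ClientSession``.
    results.append(get_shared_session(object))


async def main() -> list:
    results: list = []
    # Two concurrent coroutines on the same loop share one session.
    await asyncio.gather(worker(results), worker(results))
    return results


sessions = asyncio.run(main())
print(sessions[0] is sessions[1])  # both coroutines got the pooled session
```

A real implementation would also need to close each pooled session when its loop shuts down, which is part of the added complexity the comment alludes to.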
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]