uranusjr commented on a change in pull request #19736:
URL: https://github.com/apache/airflow/pull/19736#discussion_r753892600



##########
File path: airflow/providers/databricks/hooks/databricks.py
##########
@@ -493,3 +504,120 @@ def __init__(self, token: str) -> None:
     def __call__(self, r: PreparedRequest) -> PreparedRequest:
         r.headers['Authorization'] = 'Bearer ' + self.token
         return r
+
+
+class DatabricksAsyncHook(DatabricksHook):
+    """
+    Async version of the ``DatabricksHook``.
+    Implements only the methods needed by the Databricks triggers.
+    """
+
+    def __init__(self, *args: Any, **kwargs: Any) -> None:
+        super().__init__(*args, **kwargs)
+
+    async def __aenter__(self):
+        self._session = aiohttp.ClientSession()

Review comment:
       Note that hooks are not necessarily run in the same process, so if you 
want to share sessions among them, you must move the abstraction to the trigger 
instead.
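
   A minimal sketch of that alternative — the trigger, not the hook, owning the aiohttp session, so nothing session-shaped ever has to cross a process boundary. The class name, constructor arguments, and the Jobs API endpoint here are illustrative, not the PR's actual code:

   ```python
   class DatabricksExecutionTrigger:
       """Hypothetical sketch: the trigger owns the HTTP session lifecycle,
       so no session object is shared between hooks or processes."""

       def __init__(self, run_id: int, host: str, token: str) -> None:
           self.run_id = run_id
           self.host = host
           self.token = token

       async def run(self):
           # Lazy import: only the triggerer process needs aiohttp.
           import aiohttp

           # The session is created, used, and closed entirely inside the
           # trigger, instead of being attached to the hook in __aenter__.
           async with aiohttp.ClientSession(
               headers={'Authorization': f'Bearer {self.token}'}
           ) as session:
               url = f'https://{self.host}/api/2.1/jobs/runs/get'
               async with session.get(url, params={'run_id': self.run_id}) as resp:
                   resp.raise_for_status()
                   return await resp.json()
   ```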
   
   I wonder how viable it would be to refactor the synchronous version 
(`DatabricksHook`) into a [sans I/O](https://sans-io.readthedocs.io/) base 
class that `DatabricksExecutionTrigger` could also use, instead of implementing 
a new, entirely separate hook for it.
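
   To illustrate the sans-I/O idea: the base class only builds request descriptions and parses response payloads, and never performs network calls itself, so both a `requests`-based hook and an aiohttp-based trigger can drive it. The method names and the Jobs API shapes below are assumptions for the sketch, not existing Airflow code:

   ```python
   from typing import Any, Dict, Tuple

   class BaseDatabricksHook:
       """Hypothetical sans-I/O core: knows *what* to send and how to
       interpret the reply, but not *how* to send it."""

       def __init__(self, host: str, token: str) -> None:
           self.host = host
           self.token = token

       def auth_headers(self) -> Dict[str, str]:
           return {'Authorization': f'Bearer {self.token}'}

       def build_run_state_request(self, run_id: int) -> Tuple[str, str, Dict[str, Any]]:
           # Returns (method, url, params); the caller decides how to send it,
           # e.g. requests.request(...) in the sync hook, aiohttp in the trigger.
           url = f'https://{self.host}/api/2.1/jobs/runs/get'
           return ('GET', url, {'run_id': run_id})

       @staticmethod
       def parse_run_state(payload: Dict[str, Any]) -> str:
           # Extracts the life-cycle state from a runs/get response body.
           return payload['state']['life_cycle_state']
   ```

   The synchronous `DatabricksHook` and the async trigger would then differ only in the few lines that actually perform I/O.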

##########
File path: setup.py
##########
@@ -244,6 +244,8 @@ def write_version(filename: str = os.path.join(*[my_dir, 
"airflow", "git_version
 ]
 databricks = [
     'requests>=2.26.0, <3',
+    'aiohttp>=3.6.3, <4',
+    'asynctest~=0.13',

Review comment:
       `asynctest` should not be part of the provider extra; test-only 
dependencies belong in the `tests` extra instead.
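
   Concretely, the split might look like this (the list names are illustrative; Airflow's real setup.py wires its extras differently):

   ```python
   # Runtime dependencies of the provider stay in the provider extra:
   databricks = [
       'requests>=2.26.0, <3',
       'aiohttp>=3.6.3, <4',
   ]

   # Test-only helpers go in the tests extra, so they are never
   # installed alongside the provider itself:
   tests_require = [
       'asynctest~=0.13',
   ]
   ```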




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
