potiuk commented on a change in pull request #4353: [AIRFLOW-3480] Add Database Deploy/Update/Delete operators
URL: https://github.com/apache/incubator-airflow/pull/4353#discussion_r244038274
 
 

 ##########
 File path: airflow/contrib/hooks/gcp_spanner_hook.py
 ##########
 @@ -147,15 +149,13 @@ def _apply_to_instance(self, project_id, instance_id, configuration_name, node_c
         :param func: Method of the instance to be called.
         :type func: Callable
         """
-        client = self.get_client(project_id)
-        instance = client.instance(instance_id,
-                                   configuration_name=configuration_name,
-                                   node_count=node_count,
-                                   display_name=display_name)
+        instance = self.get_client(project_id).instance(
+            instance_id, configuration_name=configuration_name,
+            node_count=node_count, display_name=display_name)
 
 Review comment:
   I'd rather defer the hook tests to subsequent PRs (and maybe we will solve it in a slightly different way). I think it is rather difficult to test the GCP hooks automatically using mocking/classic Airflow unit tests. Please let me know what you think about the approach we use - maybe we are missing something and we could add some unit tests for hooks that would add value :).
   
   We chose a slightly different strategy for testing GCP operators (and hooks):
   
   * First of all, we try to make hooks as straightforward as possible: basically a 1-to-1 wrapper around an existing library method, except that we make it synchronous by waiting for the operation to complete. We then put pretty much all of the logic into the operators (the first sketch after this list illustrates the kind of thin wrapper we mean).
   
   * What we realised is that all we can really test with mocking and unit tests in this case is whether a particular library method has been called, which is a bit redundant - that's why we do not have those hook tests.
   
   * Instead, we run automated system tests ("we" meaning the team from my company Polidea - 3 people who work on these operators). We use the example DAGs (`example_gcp_spanner.py` in this case) to run the operators automatically against a real GCP_PROJECT. We even have a way to run them as (skipped by default) unit tests (see for example CloudSpannerExampleDagsTest; the second sketch below shows the rough pattern). We have a separate and quite sophisticated environment for that - it runs automatically in GCP Cloud Build on every push, so we test end-to-end whether the operators (and thus the hooks) work against real GCP.
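   To make the first two points concrete, here is a minimal, purely illustrative sketch (the function name, test name and mocked path are made up for this example, not the actual hook API from this PR): the hook-style wrapper is a single synchronous library call, and a mocked unit test of it can do little more than assert that the library method was invoked.

```python
import unittest
from unittest import mock

from google.cloud import spanner


def update_instance_node_count(project_id, instance_id, node_count):
    """Hypothetical thin, synchronous wrapper: one library call, then wait."""
    client = spanner.Client(project=project_id)
    instance = client.instance(instance_id, node_count=node_count)
    operation = instance.update()   # the library returns a long-running operation
    operation.result()              # make it synchronous: block until it completes
    return True


class TestUpdateInstanceWithMocks(unittest.TestCase):
    @mock.patch('google.cloud.spanner.Client')
    def test_update_calls_library(self, client_mock):
        update_instance_node_count('my-project', 'my-instance', node_count=3)
        # All the mocked test really proves is that the library method was
        # called - it restates the one-line implementation almost verbatim.
        client_mock.return_value.instance.return_value.update.assert_called_once_with()
```

   The test body mirrors the implementation line by line, which is exactly the redundancy mentioned above.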
   
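   And for the system-test pattern from the last point, a rough sketch of what a skipped-by-default example-DAG test can look like (the class name, DAG id, folder and environment variable here are assumptions for illustration; the real CloudSpannerExampleDagsTest in this PR may be structured differently):

```python
import os
import unittest

from airflow.models import DagBag

GCP_PROJECT_ID = os.environ.get('GCP_PROJECT_ID')


@unittest.skipIf(GCP_PROJECT_ID is None,
                 "Skipped by default; set GCP_PROJECT_ID to run against a real project")
class ExampleGcpSpannerDagsSystemTest(unittest.TestCase):
    def test_run_example_dag(self):
        # Load the example DAG shipped with the operators and execute it
        # end-to-end against a real GCP project (in our setup this happens
        # in GCP Cloud Build on every push).
        dagbag = DagBag(dag_folder='airflow/contrib/example_dags',
                        include_examples=False)
        dag = dagbag.get_dag('example_gcp_spanner')   # assumed dag_id
        self.assertIsNotNone(dag)
        dag.clear()
        dag.run(ignore_first_depends_on_past=True)
```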
   I am going to try to contribute it to the community very soon (it's already open-sourced and I am polishing it) so that others will be able to do the same with their own GCP projects. You can see it here: https://github.com/PolideaInternal/airflow-breeze. I will be happy to involve you for comments/review when we are ready to share it (I think in a couple of days).
   
   What do you think?
   
   
   
