sprzedwojski commented on a change in pull request #4251: [AIRFLOW-2440] Add Google Cloud SQL import/export operator
URL: https://github.com/apache/incubator-airflow/pull/4251#discussion_r237425378
 
 

 ##########
 File path: airflow/contrib/hooks/gcp_sql_hook.py
 ##########
 @@ -254,6 +254,54 @@ def delete_database(self, project, instance, database):
         operation_name = response["name"]
         return self._wait_for_operation_to_complete(project, operation_name)
 
+    def export_instance(self, project_id, instance_id, body):
+        """
+        Exports data from a Cloud SQL instance to a Cloud Storage bucket as a SQL dump
+        or CSV file.
+
+        :param project_id: Project ID of the project where the instance exists.
+        :type project_id: str
+        :param instance_id: Name of the Cloud SQL instance. This does not include the
+            project ID.
+        :type instance_id: str
+        :param body: The request body, as described in
+            https://cloud.google.com/sql/docs/mysql/admin-api/v1beta4/instances/export#request-body
+        :type body: dict
+        :return: True if the operation succeeded, raises an error otherwise
+        :rtype: bool
+        """
+        response = self.get_conn().instances().export(
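+
+The diff excerpt is cut off at the line under review. For context, here is a minimal sketch of how the method body plausibly continues, mirroring the `delete_database` pattern visible at the top of the hunk; the exact call chain in the PR may differ:
+
+```python
+def export_instance(self, project_id, instance_id, body):
+    """Sketch only: hypothetical continuation of the truncated diff above."""
+    response = self.get_conn().instances().export(
+        project=project_id,    # project that owns the instance
+        instance=instance_id,  # Cloud SQL instance name, without the project ID
+        body=body              # request body per the v1beta4 instances/export API
+    ).execute()
+    operation_name = response["name"]
+    # Block until the long-running export operation completes, following the
+    # same pattern as delete_database shown at the top of the hunk.
+    return self._wait_for_operation_to_complete(project_id, operation_name)
+```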
 
 Review comment:
   Thanks, it's a valid point. However, looking at the other methods in this hook, and in the other GCP-related hooks we've created recently, we don't use `try`...`except` anywhere.
   
   Therefore I thought that, for the sake of coherence, we could merge this "as is", and then I'd make a separate PR correcting this in all the GCP hooks?
