TobKed commented on a change in pull request #8553:
URL: https://github.com/apache/airflow/pull/8553#discussion_r508321399



##########
File path: airflow/providers/google/cloud/hooks/dataflow.py
##########
@@ -783,6 +794,77 @@ def cancel_job(
             name=job_name,
             job_id=job_id,
             location=location,
-            poll_sleep=self.poll_sleep
+            poll_sleep=self.poll_sleep,
+            num_retries=self.num_retries,
         )
         jobs_controller.cancel()
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def start_sql_job(
+        self,
+        job_name: str,
+        query: str,
+        options: Dict[str, Any],
+        project_id: str,
+        location: str = DEFAULT_DATAFLOW_LOCATION,
+        on_new_job_id_callback: Optional[Callable[[str], None]] = None
+    ):
+        """
+        Starts Dataflow SQL query.
+
+        :param job_name: The unique name to assign to the Cloud Dataflow job.
+        :type job_name: str
+        :param query: The SQL query to execute.
+        :type query: str
+        :param options: Job parameters to be executed.

Review comment:
       This is related to what @mik-laj said here:
   https://github.com/apache/airflow/pull/8553#discussion_r508048041
   I added parametrization to the example DAG, so it should serve as a helpful hint for users who are unsure how to use it.
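
   For reference, a minimal sketch of how the new `start_sql_job` hook method could be called. The query text, option keys, and resource names below are illustrative assumptions only (they are not taken from the example DAG in this PR); adjust them to your environment.

   ```python
   # Illustrative sketch: run a Dataflow SQL query via the new hook method.
   # Project, dataset, table, and option keys are placeholders.
   from airflow.providers.google.cloud.hooks.dataflow import DataflowHook

   hook = DataflowHook(gcp_conn_id="google_cloud_default")

   hook.start_sql_job(
       job_name="example-dataflow-sql-job",  # unique name for the Dataflow job
       # Hypothetical Dataflow SQL query reading from a BigQuery table.
       query="""
           SELECT sales_region, SUM(amount) AS total_amount
           FROM bigquery.table.`my-project`.my_dataset.sales
           GROUP BY sales_region
       """,
       options={
           # Output sink parameters; keys mirror the `gcloud dataflow sql query`
           # flags and are assumptions here, not a definitive list.
           "bigquery-project": "my-project",
           "bigquery-dataset": "my_dataset",
           "bigquery-table": "sales_totals",
       },
       project_id="my-project",
       location="us-central1",
   )
   ```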




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
