chirodip98 commented on code in PR #60688:
URL: https://github.com/apache/airflow/pull/60688#discussion_r2724532487


##########
providers/google/src/airflow/providers/google/cloud/hooks/datafusion.py:
##########
@@ -469,33 +477,50 @@ def start_pipeline(
             is always default. If your pipeline belongs to an Enterprise edition instance, you
             can create a namespace.
         """
-        # TODO: This API endpoint starts multiple pipelines. There will eventually be a fix
-        #  return the run Id as part of the API request to run a single pipeline.
-        #  https://github.com/apache/airflow/pull/8954#discussion_r438223116
         url = os.path.join(
             instance_url,
             "v3",
             "namespaces",
             quote(namespace),
             "start",
         )
+
         runtime_args = runtime_args or {}
+        program_id = self.cdap_program_id(pipeline_type=pipeline_type)
         body = [
             {
                 "appId": pipeline_name,
+                "programType": "workflow" if pipeline_type == DataFusionPipelineType.BATCH else "spark",

Review Comment:
   Noted
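
For readers following along, here is a minimal, self-contained sketch of the logic the diff above adds: choosing the CDAP program type from the pipeline type and building the batched `/start` request URL and body. `DataFusionPipelineType` is stubbed as a plain `Enum` here (the real class lives in the Airflow Google provider), and the real hook also adds the program id from `cdap_program_id()` and the runtime arguments to each body entry; those key names are omitted here since they are not shown in the snippet.

```python
# Sketch of the programType selection from the diff above.
# DataFusionPipelineType is stubbed; in Airflow it comes from the
# Google provider package (assumption: only BATCH and a streaming
# variant exist, matching the binary choice in the diff).
import os
from enum import Enum
from urllib.parse import quote


class DataFusionPipelineType(Enum):
    BATCH = "batch"
    STREAM = "stream"


def build_start_request(instance_url, namespace, pipeline_name, pipeline_type):
    """Return (url, body) for CDAP's batched program-start endpoint.

    Batch pipelines run as a "workflow" program; anything else is
    treated as a "spark" (streaming) program, mirroring the diff.
    """
    url = os.path.join(
        instance_url,
        "v3",
        "namespaces",
        quote(namespace),
        "start",
    )
    body = [
        {
            "appId": pipeline_name,
            "programType": "workflow"
            if pipeline_type == DataFusionPipelineType.BATCH
            else "spark",
            # The real hook also sets the program id (via
            # cdap_program_id()) and runtime arguments here.
        }
    ]
    return url, body
```

Example use: `build_start_request("https://example-instance/api", "default", "my_pipeline", DataFusionPipelineType.BATCH)` yields a `.../v3/namespaces/default/start` URL and a single-entry body with `"programType": "workflow"`.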



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
