raviagarwalunravel commented on a change in pull request #5118: [AIRFLOW-4315] Add monitoring API's to airflow
URL: https://github.com/apache/airflow/pull/5118#discussion_r280703401
 
 

 ##########
 File path: airflow/api/common/experimental/get_tasks.py
 ##########
 @@ -34,11 +33,5 @@ def get_dag_run_state(dag_id, execution_date):
     # Get DAG object and check Task Exists
     dag = dagbag.get_dag(dag_id)
 
-    # Get DagRun object and check that it exists
-    dagrun = dag.get_dagrun(execution_date=execution_date)
-    if not dagrun:
-        error_message = ('Dag Run for date {} not found in dag {}'
-                         .format(execution_date, dag_id))
-        raise DagRunNotFound(error_message)
-
-    return {'state': dagrun.get_state()}
 +    # Return the task ids.
 +    return dag.task_ids
 
 Review comment:
  What you are saying is correct, but then we would have to get the DagBag in the 
endpoints.py file and also verify there that the dag_id exists. I was also 
consciously trying not to touch models from this layer, deferring all DB 
access to the helper functions defined in the api/common/experimental 
folder. This implementation seems cleaner to me, but if you would say 
otherwise I will change it.
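
  To illustrate, here is a minimal sketch of the layering I mean: the endpoint stays thin and delegates to a helper that owns the DagBag lookup and dag_id validation. The `Dag`/`DagBag` classes below are simplified stubs standing in for Airflow's real models, and `tasks_endpoint` is a hypothetical name, not the actual endpoints.py handler.

```python
class DagNotFound(Exception):
    """Raised when the requested dag_id is not known to the DagBag."""


class Dag:
    """Stub for Airflow's DAG model: just an id and its task ids."""
    def __init__(self, dag_id, task_ids):
        self.dag_id = dag_id
        self.task_ids = task_ids


class DagBag:
    """Stub for Airflow's DagBag: a dag_id -> Dag lookup."""
    def __init__(self, dags):
        self._dags = {d.dag_id: d for d in dags}

    def get_dag(self, dag_id):
        return self._dags.get(dag_id)


def get_tasks(dagbag, dag_id):
    """Helper layer (api/common/experimental style): verify the DAG
    exists, then return its task ids. All model access lives here."""
    dag = dagbag.get_dag(dag_id)
    if not dag:
        raise DagNotFound('Dag {} not found'.format(dag_id))
    return dag.task_ids


def tasks_endpoint(dagbag, dag_id):
    """Endpoint layer: no model access, just delegate and map errors."""
    try:
        return {'tasks': get_tasks(dagbag, dag_id)}
    except DagNotFound as err:
        return {'error': str(err)}
```

  With this split, endpoints.py never touches DagBag directly, and the existence check is written once in the helper rather than repeated per endpoint.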

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services