Lee-W commented on code in PR #40868:
URL: https://github.com/apache/airflow/pull/40868#discussion_r1687404008


##########
airflow/datasets/__init__.py:
##########
@@ -271,6 +306,20 @@ def iter_datasets(self) -> Iterator[tuple[str, Dataset]]:
                 yield k, v
                 seen.add(k)
 
+    def iter_dag_deps(self, *, source: str, target: str) -> Iterator[DagDependency]:
+        """
+        Iterate datasets, dataset aliases, and their resolved datasets as dag dependencies.
+
+        :meta private:
+        """
+        dag_deps: set[DagDependency] = set()

Review Comment:
   oh, in the current `dataset_dependencies` endpoint, I think it'll be filtered out anyway, so we can also defer the deduplication until later.
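
   A minimal sketch of what deferring deduplication to the caller could look like. The `DagDependency` fields and the helper signature below are assumptions for illustration, not the actual Airflow code:

   ```python
   # Hypothetical sketch: iter_dag_deps yields everything as-is and the
   # caller (e.g. the endpoint) deduplicates later. Names are assumed.
   from dataclasses import dataclass
   from typing import Iterator


   @dataclass(frozen=True)  # frozen -> hashable, so instances can be deduplicated
   class DagDependency:
       source: str
       target: str
       dependency_id: str


   def iter_dag_deps(uris: list[str], *, source: str, target: str) -> Iterator[DagDependency]:
       # Yield every dependency, duplicates included; no set kept here.
       for uri in uris:
           yield DagDependency(source=source, target=target, dependency_id=uri)


   # Caller-side deduplication, preserving first-seen order:
   deps = list(iter_dag_deps(["test-1", "test-1"], source="dag_a", target="dag_b"))
   unique = list(dict.fromkeys(deps))
   print(len(deps), len(unique))  # 2 1
   ```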



##########
airflow/serialization/dag_dependency.py:
##########


Review Comment:
   I feel it is still something mainly for `serialisation`. `iter_dag_deps` is mainly for `serialisation` and is later used in view.py.
   
   I do not object to moving it to `utils`. I'm just not sure moving lots of things there is a good idea 🤔 



##########
airflow/datasets/__init__.py:
##########
@@ -271,6 +306,20 @@ def iter_datasets(self) -> Iterator[tuple[str, Dataset]]:
                 yield k, v
                 seen.add(k)
 
+    def iter_dag_deps(self, *, source: str, target: str) -> Iterator[DagDependency]:
+        """
+        Iterate datasets, dataset aliases, and their resolved datasets as dag dependencies.
+
+        :meta private:
+        """
+        dag_deps: set[DagDependency] = set()

Review Comment:
   > Is the order of DAG dependencies significant? If not, I think we can deduplicate where this function is called instead. 
   
   I think not, but I'll test it out.
   
   > Or, do we need to deduplicate dependencies at all…?
   
   I would probably be a +0. e.g., with `schedule=[Dataset("test-1"), Dataset("test-1")]`, should we generate two `Dataset("test-1")` entries? On the other hand, it may also be what users want 🤔 I feel it's more of a UI thing and would like to know what @bbovenzi thinks.
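
   To make the duplicated-schedule question concrete, a small sketch (the `Dataset` class here is a URI-keyed stand-in, not the real `airflow.datasets.Dataset`):

   ```python
   # Stand-in Dataset that compares by URI; an assumption for
   # illustration, not the real airflow.datasets.Dataset.
   from dataclasses import dataclass


   @dataclass(frozen=True)
   class Dataset:
       uri: str


   schedule = [Dataset("test-1"), Dataset("test-1")]
   # Without deduplication both entries survive; a set collapses
   # them into a single dependency.
   deduped = set(schedule)
   print(len(schedule), len(deduped))  # 2 1
   ```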



##########
airflow/datasets/__init__.py:
##########
@@ -271,6 +306,20 @@ def iter_datasets(self) -> Iterator[tuple[str, Dataset]]:
                 yield k, v
                 seen.add(k)
 
+    def iter_dag_deps(self, *, source: str, target: str) -> Iterator[DagDependency]:
+        """
+        Iterate datasets, dataset aliases, and their resolved datasets as dag dependencies.
+
+        :meta private:
+        """
+        dag_deps: set[DagDependency] = set()

Review Comment:
   I feel that not deduplicating might make it even more complicated.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]