ashb commented on code in PR #24743:
URL: https://github.com/apache/airflow/pull/24743#discussion_r912883209
##########
airflow/models/dagrun.py:
##########
@@ -631,6 +631,68 @@ def update_state(
session.merge(self)
# We do not flush here for performance reasons(It increases queries count by +20)
+ from airflow.models import Dataset
Review Comment:
For clarity (and perhaps ease of testing) I would suggest moving all this
logic to a separate (private?) function
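Roughly what I have in mind, as a sketch only; the helper name and signature below are invented for illustration, not taken from this PR:

```python
# Sketch: keep DagRun.update_state thin and move the new dataset bookkeeping
# into a private helper that can be unit tested on its own.
# `_create_dataset_events` and the trimmed signatures are illustrative.


class DagRun:  # stand-in for airflow.models.dagrun.DagRun
    def update_state(self, session):
        # ... existing scheduling logic ...
        session.merge(self)
        # We do not flush here for performance reasons(It increases queries count by +20)
        self._create_dataset_events(session=session)
        # ... rest of update_state ...

    def _create_dataset_events(self, *, session):
        """Create dataset events for datasets this run produced (sketch only)."""
        from airflow.models import Dataset  # the local import from the hunk above

        # the ~60 new lines from this hunk would move here unchanged
```

That would also let the new behaviour be exercised in a unit test by calling the helper directly with a session, without driving a full update_state pass.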
##########
airflow/models/dataset_dag_ref.py:
##########
@@ -0,0 +1,60 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from sqlalchemy import Column, ForeignKeyConstraint, Integer, String
+from sqlalchemy.orm import relationship
+
+from airflow.models.base import ID_LEN, Base
+from airflow.utils import timezone
+from airflow.utils.sqlalchemy import UtcDateTime
+
+
+class DatasetDagRef(Base):
+ """References from a DAG to an upstream dataset."""
+
+ dataset_id = Column(Integer, primary_key=True, nullable=False)
+ dag_id = Column(String(ID_LEN), primary_key=True, nullable=False)
+ created_at = Column(UtcDateTime, default=timezone.utcnow, nullable=False)
+ updated_at = Column(UtcDateTime, default=timezone.utcnow, onupdate=timezone.utcnow, nullable=False)
+
+ dataset = relationship('Dataset')
+
+ __tablename__ = "dataset_dag_ref"
+ __table_args__ = (
+ ForeignKeyConstraint(
+ (dataset_id,),
+ ["dataset.id"],
+ name='dataset_event_dataset_fkey',
+ ondelete="CASCADE",
+ ),
+ )
+
+ def __eq__(self, other):
+ if isinstance(other, self.__class__):
+ return self.dataset_id == other.dataset_id and self.dag_id == other.dag_id
+ else:
+ return NotImplemented
+
+ def __hash__(self):
+ return hash((self.dataset_id, self.dag_id))
Review Comment:
Do we need these, or can we use whatever behaviour SQLA defines?
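For reference, this is the default I mean (a standalone SQLAlchemy 1.4-style snippet, not Airflow code): within a session the identity map hands back the same Python object for a given primary key, so the default identity-based `__eq__`/`__hash__` already behave sensibly:

```python
# Standalone illustration of SQLAlchemy's default behaviour: ORM instances
# compare by Python identity, and the session identity map returns the same
# object for the same primary key, so == and hash() work within a session
# without custom __eq__/__hash__.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class Ref(Base):
    __tablename__ = "ref"
    dataset_id = Column(Integer, primary_key=True)
    dag_id = Column(String(250), primary_key=True)


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Ref(dataset_id=1, dag_id="example"))
    session.commit()

    first = session.get(Ref, (1, "example"))
    second = session.get(Ref, (1, "example"))
    assert first is second  # identity map: same object both times
    assert first == second  # so default equality is already what we want
```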
##########
airflow/models/dataset_dag_ref.py:
##########
@@ -0,0 +1,60 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from sqlalchemy import Column, ForeignKeyConstraint, Integer, String
+from sqlalchemy.orm import relationship
+
+from airflow.models.base import ID_LEN, Base
+from airflow.utils import timezone
+from airflow.utils.sqlalchemy import UtcDateTime
+
+
+class DatasetDagRef(Base):
Review Comment:
I personally would have defined all these classes inside
`airflow/models/dataset.py` as I don't like the one-class-per-module pattern.
(Especially with the relationship accessor change I suggest, which I think
means we never need to access these classes directly anyway.)
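Concretely, something shaped like this (just a sketch: the column definitions are abbreviated and the `dag_refs` accessor name is an assumption of mine, not code from this PR):

```python
# airflow/models/dataset.py -- sketch of the consolidated layout
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship

from airflow.models.base import ID_LEN, Base


class Dataset(Base):
    __tablename__ = "dataset"
    id = Column(Integer, primary_key=True)
    uri = Column(String(3000), nullable=False)

    # accessor-style relationship: callers use dataset.dag_refs instead of
    # importing the association class directly
    dag_refs = relationship("DatasetDagRef", back_populates="dataset")


class DatasetDagRef(Base):
    __tablename__ = "dataset_dag_ref"
    dataset_id = Column(Integer, ForeignKey("dataset.id", ondelete="CASCADE"), primary_key=True)
    dag_id = Column(String(ID_LEN), primary_key=True)

    dataset = relationship("Dataset", back_populates="dag_refs")
```

With that relationship in place, scheduler code can iterate `dataset.dag_refs` and never needs to import `DatasetDagRef` itself.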