dstandish commented on code in PR #24908:
URL: https://github.com/apache/airflow/pull/24908#discussion_r917272208


##########
airflow/models/taskinstance.py:
##########
@@ -1513,10 +1513,10 @@ def _run_raw_task(
         if not test_mode:
             session.add(Log(self.state, self))
             session.merge(self)
-            self._create_dataset_dag_run_queue_records(session=session)
+            self._create_dataset_dag_run_queue_records(context=context, session=session)
             session.commit()
 
-    def _create_dataset_dag_run_queue_records(self, *, session):
+    def _create_dataset_dag_run_queue_records(self, *, context: Context = None, session: Session):

Review Comment:
   So @jedcunningham, this is a remnant that reveals part of the secret master 
plan.  
   
   We [ultimately, eventually] need some way to allow `the tasks that do the 
updates` to transmit metadata to `the tasks that handle the datasets that were 
updated`.
   
   When starting this branch I was playing around with injecting that metadata 
into task context (from the _writing_ task), then here grabbing that metadata 
out of context and storing it in the `extra` field when stamping the dataset 
event.
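   
   For illustration, a rough sketch of the idea (not code from this PR; the key name `dataset_extra`, the helper name, and the simplified `DatasetEvent` stand-in are all made up for the example -- the real classes live in `airflow.models` and `airflow.utils.context`):
   
   ```python
   from dataclasses import dataclass, field
   from typing import Optional
   
   # Hypothetical, simplified stand-in for the real DatasetEvent model.
   @dataclass
   class DatasetEvent:
       dataset_uri: str
       extra: dict = field(default_factory=dict)
   
   def stamp_dataset_event(dataset_uri: str, context: Optional[dict] = None) -> DatasetEvent:
       """Create a dataset event, copying any metadata the writing task
       placed in its context into the event's `extra` field.
       The "dataset_extra" context key is an invented name for this sketch."""
       extra = {}
       if context is not None:
           extra = context.get("dataset_extra", {})
       return DatasetEvent(dataset_uri=dataset_uri, extra=extra)
   
   # The writing task would have injected metadata into its own context,
   # and the downstream/consuming side would read it off the event:
   ctx = {"dataset_extra": {"rows_written": 1024}}
   event = stamp_dataset_event("s3://bucket/data.csv", context=ctx)
   print(event.extra)  # {'rows_written': 1024}
   ```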
   
   BUT we're not adding that feature in this PR, so chopping it is a good call. 
I just missed it when pulling that stuff out.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
