ashb commented on issue #34206:
URL: https://github.com/apache/airflow/issues/34206#issuecomment-1711502522

   Yeah, having some way of doing this would be useful.
   
   I think we'll need two things here:
   1. A "nice" way of adding a dataset from inside task execution (such that it 
makes it to the DB)
   2. A way of marking those Datasets as "execution_time"/dynamic such that the 
dataset orphan cleanup code doesn't remove them again.
   
   We should probably document clearly that the datasets must be "complete" 
-- i.e. whatever outlets/inlets you set at execution time will be taken as the 
total set -- and also document clearly that the datasets for this Task will be 
overwritten each time the task runs. I.e. this does not allow the datasets to 
"change" from execution to execution. 
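   The "complete set, overwritten per run" semantics described above can be 
sketched with a small self-contained model. Note this is purely illustrative: 
`DatasetRegistry` and `register_outlets` are hypothetical names invented for 
this sketch (no such API exists in Airflow as of this issue); only the 
`Dataset(uri)` shape mirrors `airflow.datasets.Dataset`.

   ```python
   from dataclasses import dataclass, field


   @dataclass(frozen=True)
   class Dataset:
       # Minimal stand-in for airflow.datasets.Dataset
       uri: str


   @dataclass
   class DatasetRegistry:
       # Hypothetical store: task_id -> the complete set of outlet
       # Datasets reported by that task's most recent run.
       _outlets: dict = field(default_factory=dict)

       def register_outlets(self, task_id: str, datasets) -> None:
           # Outlets reported at execution time are taken as the TOTAL
           # set for the task: each run overwrites the previous entry
           # rather than merging with it.
           self._outlets[task_id] = set(datasets)

       def outlets_for(self, task_id: str) -> set:
           return self._outlets.get(task_id, set())


   registry = DatasetRegistry()
   # First run reports two outlets.
   registry.register_outlets(
       "my_task", [Dataset("s3://bucket/a"), Dataset("s3://bucket/b")]
   )
   # A later run reports a different complete set; it replaces the old
   # one entirely -- datasets "a" and "b" are no longer associated.
   registry.register_outlets("my_task", [Dataset("s3://bucket/c")])
   ```

   Under these semantics, point 2 above (marking such datasets as 
"execution_time"/dynamic) would be what keeps the orphan-cleanup code from 
deleting them between runs.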

