scr-oath commented on issue #33020:
URL: https://github.com/apache/airflow/issues/33020#issuecomment-1663157340

   Well… this does seem like a reasonable approach, but then… it tightly couples the two DAGs.
   
   Some colleagues are looking to version their DAGs by essentially adjusting the file name during deployment and using `Path(__file__).stem` as the dag_id. They're aware of https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-36+DAG+Versioning but not certain of its ETA.
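   
   Just to make that pattern concrete, here's roughly what I understand them to be doing - a minimal sketch (names are made up; the deployed filename, e.g. `my_pipeline_v2.py`, becomes the dag_id):

```python
from pathlib import Path

import pendulum
from airflow.decorators import dag, task


@dag(
    dag_id=Path(__file__).stem,  # whatever the deployed filename says, e.g. "my_pipeline_v2"
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
)
def versioned_pipeline():
    @task
    def do_work():
        print("running work for", Path(__file__).stem)

    do_work()


versioned_pipeline()
```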
   
   Using the approach you suggested would not allow the producer and consumer DAGs to stay decoupled / loosely coupled through the dataset.
   
   But it _DOES_ make me think that maybe the feature would be to carry the "triggering information" along with dataset triggers… if the consumer DAG (the inlet in your case) could be told the name of the DAG and/or dataset (if multiple) that triggered it, then it could ask for the XCom information by dag_id…
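   
   Something like this is what I'm imagining on the consumer side - purely a sketch, assuming the producer's dag_id (`producer_dag` here) and task id (`emit_metadata`) could somehow be handed to the consumer by the trigger; a cross-DAG pull needs `include_prior_dates=True` since the consumer's run won't line up with any producer run:

```python
import pendulum
from airflow.datasets import Dataset
from airflow.decorators import dag, task

my_dataset = Dataset("s3://bucket/some/table")  # made-up URI


@dag(
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=[my_dataset],  # dataset-triggered consumer
    catchup=False,
)
def consumer():
    @task
    def read_producer_metadata(ti=None):
        # Cross-DAG pull: include_prior_dates=True is needed because the
        # consumer's run won't share a run_id/logical date with any producer run.
        value = ti.xcom_pull(
            dag_id="producer_dag",   # this is the bit the trigger would have to tell us
            task_ids="emit_metadata",
            include_prior_dates=True,
        )
        print("producer said:", value)

    read_producer_metadata()


consumer()
```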
   
   However, it makes me think: what if the producer ran multiple times before the downstream got its trigger? Not sure how often that would be the case, but… would the above solution allow multiple separate triggers to each carry their own XCom information about their work? Conceptually, you want to get an event for each dataset write describing what happened, and I feel like using XCom risks race conditions or skipped events - if you go read it, does it just always get the "last value"?
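   
   To make that concern concrete, here's a sketch of the producer side (same made-up names as above): each run pushes its own metadata via XCom, but as far as I can tell a cross-DAG `xcom_pull` only returns the most recent value, so if this DAG ran twice before the consumer fired, the first run's metadata would effectively be skipped:

```python
import pendulum
from airflow.datasets import Dataset
from airflow.decorators import dag, task

my_dataset = Dataset("s3://bucket/some/table")  # same made-up URI as the consumer sketch


@dag(
    dag_id="producer_dag",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),),
    schedule=None,
    catchup=False,
)
def producer():
    @task(outlets=[my_dataset])  # updating the dataset is what triggers the consumer
    def emit_metadata(run_id=None):
        # The returned value is pushed as XCom under the default "return_value"
        # key for this run; the consumer sketch above pulls it by dag_id/task_id.
        return {"run_id": run_id, "manifest": f"s3://bucket/manifests/{run_id}.json"}

    emit_metadata()


producer()
```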

