uranusjr commented on issue #19578:
URL: https://github.com/apache/airflow/issues/19578#issuecomment-974681030


   Oh! I misread then, and misidentified the problem. But the root issue is 
similar: in both cases, the command finds an existing DAG run and re-runs 
either one task instance or all task instances in it.
   
   The problem is that, if a DAG run was created before the timetable was 
changed, we cannot infer a meaningful data interval for it. Before 2.2, a DAG 
run only had `execution_date`, and the data interval needs to be inferred, but 
_the inference logic would have to be based on the previous schedule 
definition_, because that is what scheduled the run, and that schedule no 
longer exists. Furthermore, a custom timetable cannot infer a data interval 
from a logical date, because there is no such interface on the timetable right 
now, and that is because the idea does not make logical sense in the first 
place: a DAG run's logical date and data interval are conceptually independent.
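   To illustrate the interface point: a timetable only maps *forward*, from 
scheduling state (or a manual trigger time) to a data interval. The sketch 
below is a standalone simplification of that shape (the method names echo 
Airflow 2.2's `Timetable` protocol, but this is not the real class and does 
not import Airflow); note that nothing here takes an arbitrary logical date 
and recovers the interval that produced it.

   ```python
   from dataclasses import dataclass
   from datetime import datetime, timedelta
   from typing import Optional

   @dataclass(frozen=True)
   class DataInterval:
       start: datetime
       end: datetime

   class DailyTimetable:
       """Simplified stand-in for a custom timetable (not the real Airflow class)."""

       def next_dagrun_info(self, last_interval: Optional[DataInterval]) -> DataInterval:
           # Scheduled runs: each interval is produced from the previous one.
           anchor = datetime(2021, 1, 1) if last_interval is None else last_interval.end
           return DataInterval(start=anchor, end=anchor + timedelta(days=1))

       def infer_manual_data_interval(self, run_after: datetime) -> DataInterval:
           # Manual runs: cover the portion of the day up to the trigger time.
           start = run_after.replace(hour=0, minute=0, second=0, microsecond=0)
           return DataInterval(start=start, end=run_after)
   ```

   A pre-2.2 run stored only its `execution_date`, so neither path above can 
recover that run's interval once the schedule that created it is gone.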
   
   So unless we change the semantics of `dags test` and `tasks test` (which is 
not trivial), we either need to “invent” something to tie the logical date to 
the data interval and give up their independence (with long-term consequences), 
or we need to…give up, in a sense. When we hit this situation, we can show a 
friendly-ish message explaining why this does not work (the run you’re trying 
to trigger was created before Airflow 2.2 and is not compatible with custom 
timetables), and provide a way for the user to get out of it. The easiest 
solution would be to simply delete that run and re-trigger it on Airflow 2.2, 
or we could potentially provide a script for the user to “migrate” the 
existing run to 2.2.
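   The “migrate” option would amount to backfilling the missing interval from 
the *old* schedule’s convention. A minimal sketch, assuming the pre-2.2 DAG ran 
on a plain daily cron schedule (the function name is hypothetical, the old 
schedule’s semantics would have to be supplied by the user, and nothing here is 
an actual Airflow API):

   ```python
   from datetime import datetime, timedelta

   def migrate_pre_22_run(execution_date: datetime) -> tuple:
       """Reconstruct (data_interval_start, data_interval_end) for a run that
       only stored execution_date.

       Pre-2.2 cron semantics: a daily run with execution_date D covered the
       data interval [D, D + 1 day). This only works because we assume we
       still know the schedule that originally created the run.
       """
       return execution_date, execution_date + timedelta(days=1)
   ```

   In practice such a script would also have to write the reconstructed 
interval back into the metadata database for each affected run, which is why 
deleting and re-triggering the run is the easier way out.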


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
