dstandish commented on code in PR #33718:
URL: https://github.com/apache/airflow/pull/33718#discussion_r1306179079
##########
airflow/sensors/time_delta.py:
##########
@@ -64,7 +64,11 @@ class TimeDeltaSensorAsync(TimeDeltaSensor):
def execute(self, context: Context):
target_dttm = context["data_interval_end"]
target_dttm += self.delta
-        self.defer(trigger=DateTimeTrigger(moment=target_dttm), method_name="execute_complete")
+ self.defer(
+ trigger=DateTimeTrigger(moment=target_dttm),
+ method_name="execute_complete",
+ timeout=self.timeout,
Review Comment:
OK so it sounds like you're saying that the problem with the current approach in
the PR is that, after a retry, when we defer again, we don't adjust the timeout
to be relative to the original start_date.
And it sounds like what we would need to do is calculate the timeout that we
pass to TaskDeferred as `(orig_start_date + self.timeout) - current_time`. Then
the sensor timeout would be evaluated relative to the first attempt, and not
from the start date of a retry. And then perhaps this would make the behavior
consistent?
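If it helps, the adjustment described above could be sketched roughly like this. The names `remaining_timeout` and `orig_start_date` are illustrative, not from the PR; the point is only that the deadline is anchored to the first attempt's start, so a retry defers with whatever time is left rather than the full `self.timeout`:

```python
from datetime import datetime, timedelta, timezone

def remaining_timeout(
    orig_start_date: datetime, total_timeout: timedelta, now: datetime
) -> timedelta:
    # The deadline is fixed relative to the FIRST attempt's start date,
    # so a retry does not restart the clock -- it only gets what's left.
    return (orig_start_date + total_timeout) - now

# e.g. the first attempt started 2 hours ago with a 5-hour sensor timeout,
# so a retry deferring now should pass a 3-hour timeout:
now = datetime(2023, 8, 25, 12, 0, tzinfo=timezone.utc)
start = now - timedelta(hours=2)
print(remaining_timeout(start, timedelta(hours=5), now))  # 3:00:00
```

A negative result would mean the overall deadline has already passed, so the sensor should presumably fail with a timeout instead of deferring again.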
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]