anon-john opened a new issue #9046:
URL: https://github.com/apache/airflow/issues/9046


   **Apache Airflow version**: 1.10.10
   **Environment**:
   - **OS** (e.g. from /etc/os-release): centos7, Ubuntu 18.04.4 LTS
   - **Install tools**: pip inside virtualenv
   
   
   Using TimeDeltaSensor inside a DAG with `schedule_interval="@once"` leads to the following error:
   ```
   [2020-05-28 09:41:59,409] {taskinstance.py:669} INFO - Dependencies all met for <TaskInstance: test_once.wait_10s 2020-05-26T00:00:00+00:00 [queued]>
   [2020-05-28 09:41:59,417] {taskinstance.py:669} INFO - Dependencies all met for <TaskInstance: test_once.wait_10s 2020-05-26T00:00:00+00:00 [queued]>
   [2020-05-28 09:41:59,418] {taskinstance.py:879} INFO -
   --------------------------------------------------------------------------------
   [2020-05-28 09:41:59,418] {taskinstance.py:880} INFO - Starting attempt 1 of 2
   [2020-05-28 09:41:59,418] {taskinstance.py:881} INFO -
   --------------------------------------------------------------------------------
   [2020-05-28 09:41:59,425] {taskinstance.py:900} INFO - Executing <Task(TimeDeltaSensor): wait_10s> on 2020-05-26T00:00:00+00:00
   [2020-05-28 09:41:59,427] {standard_task_runner.py:53} INFO - Started process 26581 to run task
   [2020-05-28 09:41:59,496] {logging_mixin.py:112} INFO - Running %s on host %s <TaskInstance: test_once.wait_10s 2020-05-26T00:00:00+00:00 [running]> muttley
   [2020-05-28 09:41:59,505] {taskinstance.py:1145} ERROR - unsupported operand type(s) for +=: 'NoneType' and 'datetime.timedelta'
   Traceback (most recent call last):
     File "/home/tfruboes/tmp/2020.05.dag_once/venv/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 983, in _run_raw_task
       result = task_copy.execute(context=context)
     File "/home/tfruboes/tmp/2020.05.dag_once/venv/lib/python3.6/site-packages/airflow/sensors/base_sensor_operator.py", line 107, in execute
       while not self.poke(context):
     File "/home/tfruboes/tmp/2020.05.dag_once/venv/lib/python3.6/site-packages/airflow/sensors/time_delta_sensor.py", line 44, in poke
       target_dttm += self.delta
   TypeError: unsupported operand type(s) for +=: 'NoneType' and 'datetime.timedelta'
   [2020-05-28 09:41:59,506] {taskinstance.py:1168} INFO - Marking task as UP_FOR_RETRY
   [2020-05-28 09:42:09,404] {logging_mixin.py:112} INFO - [2020-05-28 09:42:09,403] {local_task_job.py:103} INFO - Task exited with return code 1
   ```
   (With `@daily` it runs fine.) I would guess that TimeDeltaSensor should work with any schedule interval, especially since combining it with `@once` is useful for debugging/development purposes.
   
   The DAG definition leading to the above crash is the following:
   ```python
   from airflow import DAG
   from airflow.operators.sensors import TimeDeltaSensor
   from airflow.utils.dates import days_ago
   from datetime import timedelta

   default_args = {
       "owner": "airflow",
       "depends_on_past": False,
       "start_date": days_ago(2),
       "email": ["[email protected]"],
       "email_on_failure": False,
       "email_on_retry": False,
       "retries": 1,
       "retry_delay": timedelta(minutes=5),
       "schedule_interval": None,
   }

   dag = DAG(
       "test_once",
       default_args=default_args,
       is_paused_upon_creation=False,
       schedule_interval="@once",
   )

   delay = TimeDeltaSensor(
       task_id="wait_10s",
       delta=timedelta(seconds=10),
       dag=dag,
   )
   ```
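   Until this is fixed upstream, a possible workaround (my own sketch, not an official patch) is to guard the target computation and fall back to the execution date itself when no following schedule exists. The helper below mirrors the target computation from `TimeDeltaSensor.poke` with that guard added:

```python
from datetime import datetime, timedelta

def poke_target(following_schedule_result, execution_date, delta):
    """Compute the sensor's target datetime.

    Mirrors the target computation in TimeDeltaSensor.poke, but falls
    back to execution_date when DAG.following_schedule() returned None
    (the "@once" case). Hypothetical workaround, not the actual fix.
    """
    base = (following_schedule_result
            if following_schedule_result is not None
            else execution_date)
    return base + delta

execution_date = datetime(2020, 5, 26)
delta = timedelta(seconds=10)

# "@daily": the following schedule is execution_date + 1 day.
print(poke_target(execution_date + timedelta(days=1), execution_date, delta))

# "@once": following_schedule() returned None; fall back instead of crashing.
print(poke_target(None, execution_date, delta))  # 2020-05-26 00:00:10
```

   In a real DAG the same guard could go into a small `TimeDeltaSensor` subclass that overrides `poke`, applying the fallback before adding `self.delta`.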
   Moreover (not sure if this is a separate bug), the scheduler reports success despite the failure visible in the task log:
   ```
   INFO - Executor reports execution of test_once.wait_10s execution_date=2020-05-26 00:00:00+00:00 exited with status success for try_number 1
   ```
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

