Hi Harvey,

Thanks for reporting! Can you create a Jira issue for this? I’ll have a look and see if I can reproduce it.

- Bolke

> On 11 Jan 2017, at 16:06, Harvey Xia <harvey...@spotify.com.INVALID> wrote:
> 
> Hi all,
> 
> In Airflow 1.8 alpha 2, using LocalExecutor, DAGs do not seem to honor the
> retry_delay parameter, i.e. the retries happen immediately one after
> another without waiting the specified retry_delay time. However, the *number*
> of retries is honored. I am testing with the following code:
> 
> from airflow import DAG
> from airflow.operators.bash_operator import BashOperator
> from datetime import datetime, timedelta
> 
> default_args = {
>     'owner': 'airflow',
>     'depends_on_past': False,
>     'start_date': datetime(2016, 10, 5, 19),
>     'end_date': datetime(2016, 10, 6, 19),
>     'email': ['airf...@airflow.com'],
>     'email_on_failure': False,
>     'email_on_retry': False,
>     'retries': 10,
>     'retry_delay': timedelta(0, 500)
> }
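> # With 'retries': 10 and 'retry_delay': timedelta(0, 500), each failed attempt
> # should be retried up to 10 times, waiting 500 seconds between attempts.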
> 
> dag = DAG('test_retry_handling_job', default_args=default_args,
>           schedule_interval='@once')
> 
> task1 = BashOperator(
>     task_id='test_retry_handling_op1',
>     bash_command='exit 1',  # always fails, so the task should keep retrying
>     dag=dag)
> 
> task2 = BashOperator(
>     task_id='test_retry_handling_op2',
>     bash_command='exit 1',
>     dag=dag)
> 
> task2.set_upstream(task1)
> 
> Let me know if anyone has any ideas about this issue, thanks!
> 
> Harvey Xia | Software Engineer
> harvey...@spotify.com
> +1 (339) 225 1875
