ephraimbuddy commented on pull request #17207:
URL: https://github.com/apache/airflow/pull/17207#issuecomment-902985223


   I can now reproduce this consistently with LocalExecutor in breeze by running the example DAG below.
   ```python
   import os
   import time
   from datetime import datetime, timedelta

   from airflow import DAG
   from airflow.operators.python import PythonOperator

   dag = DAG(os.path.basename(__file__).replace('.py', ''),
             start_date=datetime(2021, 5, 11),
             schedule_interval=timedelta(days=1))


   def sleep_tester(time_out, retries):
       # Sleep repeatedly so the task stays running long enough to observe
       # the scheduler's behaviour (~12.5 minutes with the kwargs below).
       for i in range(retries):
           print(f'hi there, try {i}, going to sleep for {time_out}')
           time.sleep(time_out)
           print("Aaah, good times, see ya soon")


   sleeping = PythonOperator(task_id="sleep_well",
                             python_callable=sleep_tester,
                             op_kwargs={'time_out': 15, 'retries': 50},
                             dag=dag)
   ```
   The first 16 queued tasks ran successfully, but when the next 16 were due to run, all 16 of them failed.
   This issue looks related to task concurrency.
   Please see if you can reproduce it too with LocalExecutor: run the DAG as is and let it run without stopping it. It will take a while, but just let it run.
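   As a rough sanity check on why the batch size is 16: the DAG above has one task per run and a daily schedule from 2021-05-11, so the scheduler backfills one run per day. If the stock Airflow 2.x default `max_active_runs_per_dag = 16` applies (an assumption; breeze may override it), the first batch of concurrently runnable task instances would be exactly 16, matching what I observed. A minimal sketch of that arithmetic:
   ```python
   # Hedged sketch: MAX_ACTIVE_RUNS is the assumed airflow.cfg [core] default
   # max_active_runs_per_dag; it is not read from the running environment.
   from datetime import date, timedelta

   START = date(2021, 5, 11)   # start_date of the DAG above
   MAX_ACTIVE_RUNS = 16        # assumed default

   def first_batch(start, max_active_runs):
       """Execution dates of the first batch of runs the scheduler can start."""
       return [start + timedelta(days=i) for i in range(max_active_runs)]

   batch = first_batch(START, MAX_ACTIVE_RUNS)
   print(len(batch))            # 16
   print(batch[0], batch[-1])   # 2021-05-11 2021-05-26
   ```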

