Re: Task is Stuck in Up_For_Retry

2018-08-17 Thread ramandumcs
We are getting logs like the following:
{local_executor.py:43} INFO - LocalWorker running airflow run
{models.py:1595} ERROR - Executor reports task instance %s finished (%s) although the task says its %s. Was the task killed externally?
{models.py:1616} INFO - Marking task as UP_FOR_RETRY
It seems that
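
For context, UP_FOR_RETRY is the state a task instance enters when an attempt fails and the task still has retries left. A minimal sketch of a task configured that way (the DAG id, callable, and retry values below are illustrative, not taken from the thread) looks roughly like:

```python
# Minimal sketch of a task whose failures put it into UP_FOR_RETRY.
# The DAG id, callable, and retry values are illustrative only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def flaky_task():
    # Any exception here marks the task instance UP_FOR_RETRY until
    # `retries` is exhausted, after which it is marked FAILED.
    raise RuntimeError("simulated failure")


dag = DAG(
    dag_id="retry_example",
    start_date=datetime(2018, 8, 1),
    schedule_interval=None,
)

flaky = PythonOperator(
    task_id="flaky",
    python_callable=flaky_task,
    retries=3,                         # re-queue the task up to 3 more times
    retry_delay=timedelta(minutes=5),  # wait between attempts
    dag=dag,
)
```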

Getting Status of an External Task

2018-08-17 Thread naveen . csu
Hi, I would like to know how to get the status of a task and then, based on that status, pass a value to the next job. Example:
Task1 -- runs for 2 hours
Task2 -- checks on Task1: if complete, then a message is sent saying Task1 is complete. If failed, send a msg as
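
One rough way to do this (assuming Airflow 1.x, a DAG object `dag` that already defines `task1`, and treating the message step as a simple print) is a downstream PythonOperator that reads Task1's state from the current DagRun:

```python
# Rough sketch (Airflow 1.x) of a follow-up task that reads task1's state
# from the current DagRun and reports on it. `dag` and `task1` are assumed
# to exist elsewhere in the same DAG file.
from airflow.operators.python_operator import PythonOperator
from airflow.utils.state import State


def report_on_task1(**context):
    # The DagRun for this execution date exposes the other task
    # instances of the same run.
    ti = context["dag_run"].get_task_instance("task1")
    if ti.state == State.SUCCESS:
        message = "task1 is complete"
    elif ti.state == State.FAILED:
        message = "task1 failed"
    else:
        message = "task1 is in state %s" % ti.state
    print(message)   # could instead email, Slack, or push the message elsewhere
    return message   # the return value is pushed to XCom for downstream tasks


task2 = PythonOperator(
    task_id="task2",
    python_callable=report_on_task1,
    provide_context=True,      # Airflow 1.x: pass the context as kwargs
    trigger_rule="all_done",   # run whether task1 succeeded or failed
    dag=dag,
)
# task1 >> task2  # set the ordering next to where task1 is defined
```

If Task1 lives in a different DAG, an ExternalTaskSensor would be the usual building block instead of reading the DagRun directly.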

Re: Task is Stuck in Up_For_Retry

2018-08-17 Thread Matthias Huschle
Hi Raman, Does it happen only occasionally, or can it be easily reproduced? What happens if you start it with "airflow run" or "airflow test"? What is in the logs about it? What is your user process limit ("ulimit -u") on that machine? 2018-08-17 15:39 GMT+02:00 ramandu...@gmail.com : >
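
For anyone scripting these checks, a rough Python equivalent (the dag id, task id, and execution date below are placeholders, not values from this thread) might be:

```python
# Rough Python equivalent of the checks suggested above; the dag id,
# task id, and execution date are placeholders.
import resource
import subprocess

# Same information as "ulimit -u": the per-user process limit (Linux).
soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
print("user process limit: soft=%s hard=%s" % (soft, hard))

# Run the task by itself, outside the scheduler/executor. "airflow test"
# ignores dependencies and does not record state in the metadata DB, so it
# separates problems in the task itself from problems in scheduling.
subprocess.check_call(["airflow", "test", "my_dag", "my_task", "2018-08-17"])
```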

Re: Task is Stuck in Up_For_Retry

2018-08-17 Thread ramandumcs
Thanks Taylor, We are getting this issue even after a restart. We are observing that the task instance state transitions from scheduled -> queued -> up_for_retry and the DAG gets stuck in the up_for_retry state. Behind the scenes the executor keeps retrying the DAG's task, exceeding the max retry limit. In
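
A diagnostic sketch for looking at the stuck task instances and their retry counters directly through the Airflow ORM (assuming the usual metadata DB session is available) might look like:

```python
# Diagnostic sketch: list task instances sitting in up_for_retry along with
# their retry counters, using Airflow's own ORM session.
from airflow import settings
from airflow.models import TaskInstance
from airflow.utils.state import State

session = settings.Session()
stuck = (
    session.query(TaskInstance)
    .filter(TaskInstance.state == State.UP_FOR_RETRY)
    .all()
)
for ti in stuck:
    # try_number counts attempts so far; max_tries comes from the task's
    # `retries` setting, so attempts beyond max_tries indicate the kind of
    # runaway retrying described above.
    print(ti.dag_id, ti.task_id, ti.execution_date, ti.try_number, ti.max_tries)
session.close()
```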