Hi George,

 Thanks so much for your reply.

I am running Airflow v1.7.1.3 on two AWS instances. Both run a Celery
worker, and one of them also runs the webserver and the scheduler (so there
is only one scheduler).

My scheduler is not set to restart periodically (should it be? If so, how
often?)
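
In case it helps, the wiring in my task_ABC.py is roughly like the sketch
below. This is a minimal sketch only: the task names, commands, schedule,
and dates here are placeholders, not my actual code.

```python
# Minimal sketch of the DAG wiring; task names, commands, and dates are
# placeholders, not the real task_ABC.py.
from datetime import datetime

from airflow import DAG
from airflow.operators import BashOperator

dag = DAG(
    dag_id='task_ABC',
    start_date=datetime(2017, 5, 1),
    schedule_interval='@daily',
    max_active_runs=1,  # at most one active DAG run at a time
)

task_a = BashOperator(task_id='TaskA', bash_command='echo A', dag=dag)
task_d = BashOperator(task_id='TaskD', bash_command='echo D', dag=dag)

# Fan out to three parallel TaskB -> TaskC chains, then fan back in at
# TaskD. All three TaskB tasks call the same script with different args.
for i, arg in enumerate(('arg1', 'arg2', 'arg3'), start=1):
    task_b = BashOperator(
        task_id='TaskB%d' % i,
        bash_command='TaskB.sh {{ ds_nodash }} %s' % arg,
        dag=dag,
    )
    task_c = BashOperator(
        task_id='TaskC%d' % i,
        bash_command='echo C%d' % i,
        dag=dag,
    )
    task_a.set_downstream(task_b)
    task_b.set_downstream(task_c)
    task_c.set_downstream(task_d)
```

The key line for this thread is max_active_runs=1, which limits the DAG to
one active run at a time; it does not stop the three TaskB instances within
a single run from executing in parallel.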

-Jason


On Thu, May 25, 2017 at 1:54 PM, George Leslie-Waksman <
[email protected]> wrote:

> That is the expected behavior: two worker processes each picked up the
> same TaskB1 task instance, but one of them noticed and left it to the
> other.
>
> There are a handful of reasons it might be showing up in your logs.
>
> Which version of Airflow are you running? Is your scheduler set to restart
> periodically? Are you running more than one scheduler?
>
> On Sat, May 20, 2017 at 6:53 PM Jason Chen <[email protected]>
> wrote:
>
> > Hi Airflow team,
> >
> >
> > I am using Airflow with Celery on two nodes (i.e., two AWS instances).
> > My DAG looks like the diagram below (the DAG file is task_ABC.py). Note
> > that in the DAG file I set "max_active_runs=1".
> >
> >
> >            /---------> TaskB1 -----------> TaskC1---------\
> > TaskA -----------> TaskB2  ----------> TaskC2----------> TaskD
> >            \----------> TaskB3  -----------> TaskC3--------/
> >
> > So, after TaskA, it runs TaskB1, TaskB2 and TaskB3 simultaneously.
> > TaskB1, B2 and B3 run the same shell script (TaskB.sh) with different
> > input arguments. Airflow logs an "Another instance is running, skipping."
> > warning for TaskB1 and TaskB3 (see the logs below). It did not log the
> > same warning for TaskB2; I think that is because TaskB2 runs on a
> > different Celery node (I have two Celery nodes).
> > If I manually mark TaskB1 as successful, TaskB3 can proceed.
> >
> > The logs are below. Any idea how to handle this?
> > Thanks.
> >
> > -Jason
> >
> > ========= Log of TaskB1 ============
> >
> > [2017-05-20 23:09:47,270] {models.py:154} INFO - Filling up the DagBag
> > from /code/task_ABC.py
> > [2017-05-20 23:09:49,017] {models.py:154} INFO - Filling up the DagBag
> > from /code/task_ABC.py
> > [2017-05-20 23:09:49,165] {models.py:1196} INFO -
> >
> > --------------------------------------------------------------------------------
> > Starting attempt 1 of 2
> >
> > --------------------------------------------------------------------------------
> >
> > [2017-05-20 23:09:49,182] {models.py:1219} INFO - Executing
> > <Task(PythonOperator): TaskB1> on 2017-05-20 03:40:00
> > [2017-05-20 23:09:49,214] {task_ABC.py:185} INFO -
> > /mycode/process/gfs0p25/TaskB.sh 2017052012 rain
> > [2017-05-21 00:09:56,054] {models.py:154} INFO - Filling up the DagBag
> > from /code/task_ABC.py
> > [2017-05-21 00:09:59,759] {models.py:154} INFO - Filling up the DagBag
> > from /code/task_ABC.py
> > [2017-05-21 00:10:00,008] {models.py:1146} WARNING - Another instance
> > is running, skipping.
> >
> >
> > ========= Log of TaskB3 ============
> >
> > [2017-05-20 23:09:44,660] {models.py:154} INFO - Filling up the DagBag
> > from /code/task_ABC.py
> > [2017-05-20 23:09:46,047] {models.py:154} INFO - Filling up the DagBag
> > from /code/task_ABC.py
> > [2017-05-20 23:09:46,205] {models.py:1196} INFO -
> >
> > --------------------------------------------------------------------------------
> > Starting attempt 1 of 2
> >
> > --------------------------------------------------------------------------------
> >
> > [2017-05-20 23:09:46,224] {models.py:1219} INFO - Executing
> > <Task(PythonOperator): TaskB3> on 2017-05-20 03:40:00
> > [2017-05-20 23:09:46,257] {best_weather-BLEND-v1-1-0.py:245} INFO -
> > /mycode/process/gfs0p25/TaskB.sh 2017052012 snow
> > [2017-05-21 00:09:48,029] {models.py:154} INFO - Filling up the DagBag
> > from /code/task_ABC.py
> > [2017-05-21 00:09:49,080] {models.py:154} INFO - Filling up the DagBag
> > from /code/task_ABC.py
> > [2017-05-21 00:09:49,156] {models.py:1146} WARNING - Another instance
> > is running, skipping.
> >
>
