Hi Germain,

As long as the structure of the DAG is not changed (tasks are the same and
the dependency graph does not change), there should be no need to restart
anything.

The scheduler only needs the structure of the DAG to send the right message
to Celery. Essentially, the message tells the worker to run an "airflow run"
command for this dag_id, this task_id and this execution_date.
While the webserver, for instance, might still show you an older version of
the bash script, the code actually executed will be the latest version
available on the worker. You can verify this by looking at the task's logs,
since the script is usually logged there.
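
For example, assuming a hypothetical DAG and task (the names and date below
are only for illustration), the command a worker ends up executing looks
roughly like:

    airflow run my_dag my_bash_task 2017-07-17T00:00:00

so whichever version of the script is on that worker's disk at execution
time is what actually runs.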

I hope this helps,

Sincerely,
Arthur


On Mon, Jul 17, 2017 at 11:56 PM, Germain TANGUY <
germain.tan...@dailymotion.com> wrote:

> Hello everybody,
>
> I would like to know what your procedures are for deploying new versions of
> your DAGs, especially for DAGs that have external dependencies (bash
> scripts, etc.).
> I use the CeleryExecutor with multiple workers, so there is an issue of
> consistency between the workers, the scheduler and the webserver.
>
> Today I pause the DAGs, wait until all running tasks complete, restart
> all Airflow services and unpause the DAGs. Is there a better way?
>
> Best regards,
>
> Germain T.
>
>
