[ https://issues.apache.org/jira/browse/AIRFLOW-687?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16431330#comment-16431330 ]

John Arnold commented on AIRFLOW-687:
-------------------------------------

Commenting on this old Jira for posterity: my solution was to import the 
execute_command task from Airflow into a "regular" Celery app and register the 
task with that app.  That way I can run vanilla Celery workers using 'celery 
multi', which supports graceful restarts etc.  It also allows sharing workers 
and config between Airflow and our much larger Celery installation.
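
For reference, a minimal sketch of what that setup might look like, assuming
the Airflow 1.x layout where execute_command lives in
airflow.executors.celery_executor; the module name, app name, and
broker/backend URLs below are placeholders and would need to match whatever
Airflow's CeleryExecutor is actually configured to use:

    # my_celery_app.py -- sketch only, not tied to a specific Airflow version
    from celery import Celery
    from airflow.executors.celery_executor import execute_command

    app = Celery(
        'shared_workers',
        broker='redis://localhost:6379/0',   # same broker the CeleryExecutor publishes to
        backend='redis://localhost:6379/1',  # same result backend, if one is configured
    )

    # Re-register Airflow's task under the exact name the CeleryExecutor sends,
    # so vanilla workers started from this app will consume those messages.
    execute_airflow_command = app.task(name=execute_command.name)(execute_command.run)

Workers can then be managed with plain Celery tooling, e.g.
'celery multi start w1 -A my_celery_app' and
'celery multi restart w1 -A my_celery_app'; the restart performs a warm
(graceful) shutdown that lets in-flight tasks finish before the worker comes
back up.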

> Gracefully halt workers through CLI
> -----------------------------------
>
>                 Key: AIRFLOW-687
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-687
>             Project: Apache Airflow
>          Issue Type: Bug
>            Reporter: Paul Zaczkieiwcz
>            Priority: Minor
>
> When deploying a new set of airflow DAGs, it is useful to gracefully shut 
> down all airflow services and restart them.  This allows you to pip install 
> requirements for your DAGs in a virtual environment so that you're sure that 
> your DAGs don't contain unmet dependencies.
>
> Trouble is, if you kill the celery workers then they'll drop their current 
> task on the floor. There should be a CLI option to gracefully shut down the 
> workers so that deploy scripts can restart all services without worrying 
> about killing the workers.


