In Airflow 1.6.2, the concurrency controls are sometimes ignored and many tasks get scheduled simultaneously. I don't know whether this has been completely fixed. You can rely on them to separate your task runs *most* of the time, but not *all* of the time, so don't write code that depends on exclusive operation.
Lance

On Thu, Aug 11, 2016 at 1:15 PM, Kurt Muehlner <[email protected]> wrote:

> I’m not aware of a concurrency limit at task granularity; however, one
> available option is the ‘max_active_runs’ parameter in the DAG class:
>
>     max_active_runs (int) – maximum number of active DAG runs; beyond this
>     number of DAG runs in a running state, the scheduler won’t create new
>     active DAG runs
>
> I’ve used the ‘pool of size 1’ option you mention as a very simple way to
> ensure two DAGs run in serial.
>
> Kurt
>
> On 8/11/16, 6:57 AM, "הילה ויזן" <[email protected]> wrote:
>
> > Should I use a pool of size 1?
> >
> > On Thu, Aug 11, 2016 at 4:46 PM, הילה ויזן <[email protected]> wrote:
> >
> > > Hi,
> > > I searched the documentation for a way to limit a specific task's
> > > concurrency to 1, but didn't find one.
> > > I thought 'depends_on_past' would achieve this, but I want the task
> > > to run even if its previous run failed - I just want to be sure the
> > > runs don't run in parallel.
> > >
> > > The task doesn't have a downstream task, so I can't use
> > > 'wait_for_downstream'.
> > >
> > > Am I missing something?
> > >
> > > Thanks,
> > > Hila

--
Lance Norskog
[email protected]
Redwood City, CA
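For anyone finding this thread later, here is roughly what the two suggestions above look like in DAG code. This is a minimal sketch assuming Airflow 1.7-era APIs; the dag_id, task, and pool name ('single_slot') are made up for illustration, and the pool must be created separately (e.g. via Admin -> Pools in the web UI) with exactly one slot:

    # Sketch only: assumes Airflow ~1.7-era imports and a pre-created
    # 1-slot pool named 'single_slot'. All names here are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    dag = DAG(
        dag_id='serialized_example',      # hypothetical DAG name
        start_date=datetime(2016, 8, 1),
        schedule_interval='@hourly',
        max_active_runs=1,                # at most one active run of this DAG
    )

    # Assigning the task to a 1-slot pool means the scheduler queues any
    # second instance until the running one releases the slot.
    exclusive_task = BashOperator(
        task_id='exclusive_task',
        bash_command='echo "only one of me at a time"',
        pool='single_slot',
        dag=dag,
    )

Unlike depends_on_past, neither setting blocks the next run when the previous one failed; they only limit how many instances run at once. Per Lance's caveat above, though, treat this as best-effort serialization, not a hard mutual-exclusion guarantee.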
