I’m not aware of a concurrency limit at task granularity; however, one 
available option is the ‘max_active_runs’ parameter on the DAG class.

  max_active_runs (int) – maximum number of active DAG runs; beyond this number 
of DAG runs in a running state, the scheduler won’t create new active DAG runs
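As a rough sketch of what that looks like (names and schedule here are made up for illustration, and the import paths assume an Airflow install of roughly that era):

```python
# Hypothetical example: cap this DAG at one active run at a time.
# With max_active_runs=1, the scheduler will not start a new DAG run
# while a previous run is still in the "running" state.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    dag_id='serial_dag_example',        # hypothetical DAG id
    start_date=datetime(2016, 8, 1),
    schedule_interval='@hourly',
    max_active_runs=1,                  # only one DAG run in flight at once
)

task = BashOperator(
    task_id='do_work',
    bash_command='echo "one run at a time"',
    dag=dag,
)
```

Note this limits concurrency at the DAG-run level, not per task, which is why it only partially answers the original question.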

I’ve used the ‘pool of size 1’ option you mention as a very simple way to 
ensure two DAGs run serially.
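For the pool approach, the idea is to create a pool with a single slot and assign the task to it; any task instance assigned to that pool then waits for the slot to free up. A minimal sketch, assuming a pool named 'single_slot' has already been created with 1 slot (via the Admin → Pools UI or the `airflow pool` CLI command) — the task names here are hypothetical:

```python
# Hypothetical example: serialize a task across DAG runs using a pool.
# Assumes a pool called "single_slot" with exactly 1 slot already exists.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    dag_id='pooled_dag_example',
    start_date=datetime(2016, 8, 1),
    schedule_interval='@hourly',
)

task = BashOperator(
    task_id='exclusive_task',
    bash_command='echo "only one instance of me runs at a time"',
    pool='single_slot',   # all instances of this task compete for the 1 slot
    dag=dag,
)
```

Because the pool is shared, tasks from different DAGs can also be pointed at the same single-slot pool to keep them from overlapping.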

Kurt

On 8/11/16, 6:57 AM, "הילה ויזן" <[email protected]> wrote:

    should I use pool of size 1?
    
    On Thu, Aug 11, 2016 at 4:46 PM, הילה ויזן <[email protected]> wrote:
    
    > Hi,
    > I searched in the documentation for a way to limit a specific task
    > concurrency to 1,
    > but didn't find a way.
    > I thought that 'depends_on_past' should achieve this goal, but I want the
    > task to run even if the previous task failed - just to be sure they
    > don't run in parallel.
    >
    > The task doesn't have a downstream task, so I can't use
    > 'wait_for_downstream'.
    >
    > Am I missing something?
    >
    > Thanks,
    > Hila
    >
    >
    