[ https://issues.apache.org/jira/browse/AIRFLOW-4785?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17077897#comment-17077897 ]

ASF GitHub Bot commented on AIRFLOW-4785:
-----------------------------------------

tooptoop4 commented on pull request #8080: [AIRFLOW-4785] Don't apply limits on 
running DummyOperator
URL: https://github.com/apache/airflow/pull/8080
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Config flag to exclude dummyoperator from having Unknown 'Dependencies 
> Blocking Task From Getting Scheduled'
> ------------------------------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-4785
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-4785
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: dependencies, scheduler
>    Affects Versions: 1.10.3
>            Reporter: t oo
>            Assignee: t oo
>            Priority: Minor
>
> My DAG has tasks from 12 different types of operators. One of them is the 
> DummyOperator (which is meant to do 'nothing'), but during busy times it 
> cannot run because the {{parallelism}}, {{dag_concurrency}}, 
> {{max_active_dag_runs_per_dag}} and {{non_pooled_task_slot_count}} limits 
> have been met (so it is stuck in the scheduled state). I would like a new 
> config flag (dont_block_dummy=True) so that DummyOperator tasks always get 
> run even if the parallelism etc. limits are met. Without this feature, the 
> only workaround is to set a huge parallelism limit (above the current value) 
> and then give pools to all the other operators in my DAG (sketched below, 
> after the quoted task instance details). My idea is that DummyOperator 
> should not have these limits applied, as it is not a resource hog.
>  
> h4. Task Instance Details
> h5. Dependencies Blocking Task From Getting Scheduled
> ||Dependency||Reason||
> |Unknown|All dependencies are met but the task instance is not running. In 
> most cases this just means that the task will probably be scheduled soon 
> unless:
> - The scheduler is down or under heavy load
> - The following configuration values may be limiting the number of queueable 
> processes: {{parallelism}}, {{dag_concurrency}}, 
> {{max_active_dag_runs_per_dag}}, {{non_pooled_task_slot_count}}
>  
> If this task instance does not start soon please contact your Airflow 
> administrator for assistance.|
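
The workaround described in the quoted issue can be illustrated with a short
DAG sketch (Airflow 1.10.x imports; the pool name "heavy_pool", the task ids
and commands are hypothetical placeholders, and the pool must be created
beforehand under Admin -> Pools): the non-dummy tasks are confined to an
explicit pool, so with a high {{parallelism}} setting the DummyOperator join
is not the task left stuck in the scheduled state.

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.dummy_operator import DummyOperator

with DAG(
    dag_id="example_dummy_not_blocked",  # hypothetical dag_id
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
) as dag:
    # "Real" work competes for slots in an explicit pool instead of the
    # global non-pooled slot count.
    extract = BashOperator(
        task_id="extract",
        bash_command="echo extract",
        pool="heavy_pool",
    )
    load = BashOperator(
        task_id="load",
        bash_command="echo load",
        pool="heavy_pool",
    )

    # The dummy join does no work; with parallelism raised high enough it
    # should not be the task blocked by the limits listed above.
    join = DummyOperator(task_id="join")

    extract >> join >> load
{code}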



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
