MkSafavi opened a new issue, #29959:
URL: https://github.com/apache/airflow/issues/29959

   ### Description
   
   Expand mapped tasks in batches, so that a mapped task can spawn more than 1024 task instances.
   
   ### Use case/motivation
   
   The maximum length of a mapped task list is limited to 1024 by `max_map_length` (`AIRFLOW__CORE__MAX_MAP_LENGTH`).
   During scheduling of the new tasks, an UPDATE query is run that tries to set all the new task instances at once. Increasing `max_map_length` beyond 4K makes the Airflow scheduler completely unresponsive.
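
   The proposed fix would issue the UPDATE in bounded batches rather than in one statement that references every new task instance. A minimal, database-agnostic sketch of that idea (using `sqlite3` purely for illustration; the table and column names here are hypothetical and not Airflow's actual schema):

   ```python
   import sqlite3

   conn = sqlite3.connect(":memory:")
   conn.execute("CREATE TABLE task_instance (id INTEGER PRIMARY KEY, state TEXT)")
   conn.executemany(
       "INSERT INTO task_instance (id, state) VALUES (?, ?)",
       [(i, "none") for i in range(5000)],
   )

   BATCH_SIZE = 1024  # keep each statement's parameter list bounded
   ids = list(range(5000))
   for start in range(0, len(ids), BATCH_SIZE):
       batch = ids[start:start + BATCH_SIZE]
       placeholders = ",".join("?" * len(batch))
       # One bounded UPDATE per batch instead of a single statement
       # referencing all 5000 rows at once.
       conn.execute(
           f"UPDATE task_instance SET state = 'scheduled' WHERE id IN ({placeholders})",
           batch,
       )
   conn.commit()
   ```

   Keeping each statement's parameter list bounded avoids both the unresponsive scheduler and the database-side depth limits described below.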
   
   Also, Postgres throws a `stack depth limit exceeded` error, which can be fixed by updating to a newer version and increasing `max_stack_depth`. But that doesn't really matter, because the Airflow scheduler freezes up regardless.
    
   As a workaround, I split the DAG runs into sub-DAG runs, which works, but it would be much nicer if we didn't have to worry about exceeding `max_map_length`.
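
   The batching workaround above can be sketched as a plain Python helper (the `chunk` helper below is hypothetical, not part of Airflow):

   ```python
   def chunk(items, size):
       """Split a list into consecutive chunks of at most `size` elements,
       so each chunk stays under Airflow's max_map_length (default 1024)."""
       return [items[i:i + size] for i in range(0, len(items), size)]

   # Example: 5000 work items would exceed the default limit of 1024 if
   # mapped directly, so map over chunks and iterate inside each task.
   work_items = list(range(5000))
   batches = chunk(work_items, 1024)
   # In a DAG this might be used as: process_batch.expand(batch=batches)
   ```

   Each mapped task instance then receives one batch and loops over it internally, trading per-item parallelism for staying within the limit.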
   
   ### Related issues
   
   It was discussed here:
   [Increasing 'max_map_length' leads to SQL 'max_stack_depth' error with 5000 
dags to be spawned #28478](https://github.com/apache/airflow/discussions/28478)
   
   ### Are you willing to submit a PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

