GitHub user ksidata added a comment to the discussion: DagBag import is taking 
longer than in an older version, which makes some tasks hit timeout-exceeded 
errors

@potiuk Thank you for your answer. I guess that's most likely the reason (or at 
least an accurate explanation) for our timeout issues. The only remaining 
ambiguity is why scaling up the workers (roughly 2x memory and 1.5x CPU) didn't 
significantly reduce the import time. Are there any configs that cap the 
resource consumption of a single worker process, for example, no matter how 
much we scale up the worker node? Or what else could explain having timeout 
issues due to a heavier import load in newer versions while, at the same time, 
scaling up doesn't seem to reduce the time cost?
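One detail worth noting here: importing a single DAG file is sequential, single-threaded Python, so adding CPU cores or memory to the worker node generally cannot make one file's imports finish faster; it only helps when many files are parsed in parallel. A practical first step is to measure which top-level imports in a DAG file dominate the parse time. Below is a minimal, hedged sketch using only the standard library (the module names timed are placeholders, not from the discussion):

```python
import importlib
import sys
import time


def time_import(module_name: str) -> float:
    """Import a module cold and return the wall-clock seconds it took."""
    # Drop any cached copy first, so we measure a cold import --
    # the same cost DAG parsing pays in a fresh parser process.
    sys.modules.pop(module_name, None)
    start = time.perf_counter()
    importlib.import_module(module_name)
    return time.perf_counter() - start


if __name__ == "__main__":
    # Hypothetical example: in a real DAG file you would time the heavy
    # third-party libraries imported at module top level instead.
    for name in ("json", "csv", "decimal"):
        print(f"{name}: {time_import(name):.4f}s")
```

If a single library dominates, moving its import inside the task callable (so it runs at execution time, not parse time) usually reduces DagBag import time far more than scaling the worker node.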

GitHub link: 
https://github.com/apache/airflow/discussions/44402#discussioncomment-11395714

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: [email protected]
