notatallshaw commented on PR #39946: URL: https://github.com/apache/airflow/pull/39946#issuecomment-2143545389
> Also @notatallshaw -> this is probably as close as it can get to the #39100 you opened and likely will make your `apache-airflow[all]` installations quite a bit faster. UPDATE: Actually you wanted to put limits for airflow -> provider dependencies, so it's a bit different (those are limits on airflow and providers' own dependencies) - but let's see where it can get us.

Yeah, it's not quite the same thing. The reason I asked for putting limits on the `airflow -> provider` dependencies is that, without them, resolution algorithms are exposed to the possibility of trying to exhaust every version of a provider's dependencies. So while new lower limits on the direct dependencies of providers will in general speed up the typical use case, and the extra testing is great, they will not prevent pathological backtracking, particularly if the user has additional requirements.

Specifically, if the user is using uv with Python 3.11, and uv ends up backtracking on the apache-beam provider such that it hits these two requirements: `"dill>=0.2.2" "apache-beam<=2.49.0"`, then uv will exhaustively backtrack and try to build an sdist that cannot be built, and the resolution will fail. Fortunately the latest version of `apache-airflow[all]` by itself does not trigger that on either main or this branch!

I did a little performance testing with uv (a rough sketch of the kind of commands is below): I see `apache-airflow[all]` resolution speed up by about 10% with a warm cache, though without a cache the difference was not measurable (uv is fast enough on non-pathological resolutions that network cost dominates the total time). I will try and get round to testing with pip.
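For anyone who wants to poke at this themselves, something along these lines exercises resolution only (no install). This is an illustrative sketch, not the exact invocation used here: the file names are made up, and the uv flags are assumed from recent uv releases and may differ between versions.

```bash
# Illustrative only: resolve the pathological combination described above,
# targeting Python 3.11 (uv pip compile resolves without installing).
printf '%s\n' 'apache-airflow[all]' 'dill>=0.2.2' 'apache-beam<=2.49.0' > pathological-test.in
uv pip compile --python-version 3.11 pathological-test.in

# Rough cold-cache vs warm-cache timing for apache-airflow[all] on its own.
printf '%s\n' 'apache-airflow[all]' > airflow-all.in
time uv pip compile --python-version 3.11 --no-cache airflow-all.in
time uv pip compile --python-version 3.11 airflow-all.in
```

The first compile is the case that can wander into old apache-beam sdists; the last two are just the cold-cache vs warm-cache comparison mentioned above.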
