potiuk commented on issue #64954: URL: https://github.com/apache/airflow/issues/64954#issuecomment-4216826950
> It is unexpected for me, that the newer Python version expects the older version of numpy and I considered this an error in the constraint file. Especially seeing, that the Docker image of Airflow 3.2.0 with Python 3.14 brings numpy 2.4.3.

Not really:

```
⏚ [jarekpotiuk:~/code/airflow] main+ 127 ± docker run -it apache/airflow:3.2.0-python3.14 bash -c "pip freeze | grep numpy"
numpy==2.2.6
⏚ [jarekpotiuk:~/code/airflow] main+ ±
```

But I am glad you asked - I understand you are curious, and that piqued my curiosity as well and made me review whether all is good in our pipeline. It looks like it is, but I will likely add some protection to make sure that the release manager double-checks that the constraints are regenerated "just" before release (this time they were generated a few days before release, and we got a timing issue).

The reason why on Python 3.14 we have numpy 2.2.6 is that some of the providers held it back from being upgraded. If you look at the commits to the constraints-3-2 branch, numpy is already upgraded to a newer version for 3.14 for the upcoming 3.2.1 release - and it came with a number of provider upgrades, as they were released after the constraints had been resolved automatically by the resolution algorithm. You can see the update in this commit, from a few days ago, after the constraints for 3.2.0 were generated (this was after a new wave of providers was released and after 3.2.0rc2 was prepared for voting): https://github.com/apache/airflow/actions/runs/24048921103/job/70144926268#step:11:2547

Most likely - if you would really like to understand why and dig deeper - you will find that the previous version of one of the updated providers held it back. Most likely the amazon provider: we had some dependencies updated there recently for improved Python 3.14 compatibility, and likely that one freed numpy to upgrade.
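You can compare the pin yourself between the fixed release tag and the moving branch. A minimal sketch, assuming the URL pattern of Airflow's published constraints files (the refs and Python version below are illustrative - adjust them to your setup):

```shell
# Compare the numpy pin between the fixed release tag and the moving branch
# (ref names and the Python version are assumptions - adjust to yours)
for ref in constraints-3.2.0 constraints-3-2; do
  echo "$ref:"
  curl -sSL "https://raw.githubusercontent.com/apache/airflow/$ref/constraints-3.14.txt" \
    | grep '^numpy=='
done
```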
Python dependencies (and dependencies in general) and resolving them is a very complex task. We delegate the resolution to uv and actively work on freeing versions that are held back by other dependencies. We even have this nice tool I developed that tells us why we cannot upgrade something, and when you run it on v3-2-test it nicely shows we are not held back any more (but we were):

<img width="1569" height="398" alt="Image" src="https://github.com/user-attachments/assets/093a4a32-460a-4d21-8e3c-630c7d814349" />

So no - this is not a bug - just timing of the preparation of the image and constraints.

And just a comment for you: it is likely fine (in the case of your combination of providers) to **not** use constraints at all - when you are using your own requirements, you are not limited by constraints. Constraints, as clearly explained in our docs (https://airflow.apache.org/docs/apache-airflow/stable/installation/installing-from-pypi.html#constraints-files), are not "requirements" - they are constraints, and they are useful for a reproducible installation of Airflow and of our container image - that's about it.

And there is absolutely no guarantee that you will not need a compiler for that - many of our dependencies need a compilation step on various Python versions or systems. So there is absolutely no guarantee that compilation is not needed when you install Airflow. This is not Airflow - this is how Python installation works. If you install it in an environment where you have no compiler, you were simply lucky that you have not needed it for anything else. But it is neither expected nor guaranteed - especially if you choose to build your own image from scratch.

Also - you are absolutely not blocked.
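As one concrete workaround, the "use your own constraints" approach described in the docs linked above can be sketched as follows (the release tag, Python version, and the numpy pin are illustrative assumptions - test the result yourself before relying on it):

```shell
# Download the published constraints for your Airflow/Python combination
# (versions below are illustrative assumptions - adjust to your setup)
curl -sSL \
  "https://raw.githubusercontent.com/apache/airflow/constraints-3.2.0/constraints-3.14.txt" \
  -o my-constraints.txt

# Bump the numpy pin to a version you have tested yourself
sed -i 's/^numpy==.*/numpy==2.4.3/' my-constraints.txt

# Install Airflow against your own, modified constraints file
pip install "apache-airflow==3.2.0" --constraint my-constraints.txt
```

The same three steps drop into a Dockerfile `RUN` instruction if you build your own image.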
This is explained in our documentation: https://airflow.apache.org/docs/apache-airflow/stable/installation/installing-from-pypi.html#using-your-own-constraints - you can download the Airflow constraints, modify them in the way that you see fit, and use them instead of the Airflow constraints - for example, in this case, updating the numpy version, testing it, and using your own constraints file. It should be pretty easy to modify your Dockerfile to handle the case, even if it is exceptional.

We could theoretically regenerate those constraints, but since we have not tested them during release, we cannot guarantee they work - so us updating the constraints is not really an option. There are very good reasons why we fix constraints at release time and update them only in very exceptional situations - when Airflow stops installing at all due to a setuptools release or some other "breaking" event. The reasoning, and why we will not do it when a single user (or even a few) has a problem they can very easily work around, is explained here: https://airflow.apache.org/docs/apache-airflow/stable/installation/installing-from-pypi.html#fixing-constraints-at-release-time

The occasions when we actually regenerated constraints and rebuilt our images are summarized here: https://airflow.apache.org/docs/docker-stack/changelog.html#changes-after-publishing-the-images - there were 9 such events since 2022. Regenerating constraints and rebuilding the image is extra effort for maintainers, and we only do it when our users are completely blocked and have no known workaround for serious installation problems with Airflow. This is certainly not the case here - you have ways to work around your problem.

Summarising:

* Numpy in constraints is **working** on Python 3.14 - some of our providers kept it at a lower version when we released Airflow.
* It will be upgraded in a future release, now that the condition that held it back has been removed.
* You have ways to install a newer version of numpy for Python 3.14.

I hope this explanation is sufficient?

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: [email protected] For queries about this service, please contact Infrastructure at: [email protected]
