> releases in pip should have stable (pinned deps)

I think that's an issue. When setup.py (the only requirements that setuptools/pip knows about) is restrictive, there's no way to change that in your environment; the install will simply fail if you deviate (are there any hacks/solutions around that that I don't know about?). For example, if you want a specific version of pandas in your environment and Airflow's setup.py has another version of pandas pinned, you're out of luck. I think the only way at that point is to fork and make your own build, since you cannot alter setup.py once it's installed. On the other hand, when a version range is specified in setup.py, you're free to pin with your own requirements.txt within the specified range.
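To make the difference concrete, here is a minimal sketch of ranged (rather than pinned) dependencies in setup.py; the package names and version bounds below are made up for illustration and are not Airflow's actual pins:

    # setup.py -- hypothetical excerpt with ranged (not pinned) dependencies
    from setuptools import setup

    setup(
        name="apache-airflow",
        install_requires=[
            # any 0.x release at or above 0.17.1 satisfies this range
            "pandas>=0.17.1, <1.0.0",
        ],
    )

and the user's own requirements.txt, installed on top of it, can then pin inside that range:

    pandas==0.23.4

If setup.py had a hard pin like pandas==0.22.0 instead, that user pin would conflict and the install would fail.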
I think pinning in setup.py is just not viable. setup.py should have version ranges based on semantic-versioning expectations (e.g. lib>=1.1.2, <2.0.0). Personally I think we should always have two bounds, the upper bound being either 1) the next semantic-versioning major release, or 2) a version lower than semver would prescribe, when we know it breaks backwards-compatibility features we require. I think we have consensus around something like pip-tools to generate a "deterministic" `requirements.txt` (a rough sketch of that workflow, including the pip --constraint override, follows after the quoted thread below). A caveat is that we may need two of them: requirements.txt and requirements3.txt for Python 3, as some package versions can be flagged as py2-only or py3-only.

Max

On Fri, Oct 19, 2018 at 1:47 AM Jarek Potiuk <jarek.pot...@polidea.com> wrote:

> I think I might have a proposal that could be acceptable to everyone in the discussion (hopefully :) ). Let me summarise what I am leaning towards now:
>
> I think we can have a solution where it will be relatively easy to keep both "open" and "fixed" requirements (open in setup.py, fixed in requirements.txt). Possibly we can use pip-tools or poetry (including poetry-setup <https://github.com/orsinium/poetry-setup>, which seems to be able to generate setup.py/constraints.txt/requirements.txt from a poetry setup). Poetry is still "new" so it might not work; in that case we can try a similar approach with pip-tools or our own custom solution. Here are the basic assumptions:
>
>    - we leave master with "open" requirements, which makes it potentially unstable, with potentially conflicting dependencies. We will also document how to generate a stable set of requirements (hopefully automatically) and how to install from master using it. *This addresses the needs of people using master for active development with the latest libraries.*
>    - releases in pip should have stable (pinned) deps. Upgrading pinned releases to the latest "working" stable set should be part of the release process (possibly automated with poetry). We can try it out and decide whether we want to pin only direct dependencies or the transitive ones as well (I think including transitive dependencies is a bit more stable). *This way we keep the long-term "install-ability" of releases and make the release maintainer's job easier.*
>    - CI builds will use the stable dependencies from requirements.txt. *This way we keep CI free from dependency-triggered failures.*
>    - we add documentation on how to use pip's --constraint mechanism for anyone who would like to install Airflow from pip rather than from sources, but would also like to use other (up- or down-graded) versions of specific dependencies. *This way we let active developers work with Airflow and more recent or older releases.*
>
> If we have general consensus that we should try it, I might find some time next week to do some "real work". Rather than implementing it and making a pull request immediately, I am thinking of a Proof of Concept branch showing how it would work (with some artificial rollback to older versions of requirements). I thought about the pre-flask-appbuilder upgrade in one commit and the update to the post-flask-appbuilder upgrade in a second, explaining the steps I took to get there. That would make it much easier for the community to discuss whether this is the right approach.
>
> Does it sound good?
>
> J.
>
> On Wed, Oct 17, 2018 at 2:21 AM Daniel (Daniel Lamblin) [BDP - Seoul] <lamb...@coupang.com> wrote:
>
> > On 10/17/18, 12:24 AM, "William Pursell" <willi...@wepay.com.INVALID> wrote:
> >
> > I'm jumping in a bit late here, and perhaps have missed some of the discussion, but I haven't seen any mention of the fact that pinning versions in setup.py isn't going to solve the problem. Perhaps it's my lack of experience with pip, but currently pip doesn't provide any guarantee that the version of a dependency specified in setup.py will be the version that winds up being installed. Is this a known issue that is being intentionally ignored because it's hard (and out of scope) to solve? I agree that versions should be pinned in setup.py for stable releases, but I think we need to be aware that this won't solve the problem.
> >
> > So the problem is going to be stubborn only for the rare user who is not installing into a clean venv, VM, or Docker image, or who is not relying on PyPI to host the dependencies unmodified.
> > https://pip.pypa.io/en/stable/user_guide/#pinned-version-numbers
> > That doesn't mean it doesn't fix it for the vast majority of users who are trying to install a particular supported stable release. Given that 1.10.0 is the absolute latest release, it should be supported.
> >
> > Shouldn't there be an expectation that installing on a clean system from a supported stable branch will create a stable installation that can run the release?
>
> --
> *Jarek Potiuk, Principal Software Engineer*
> Mobile: +48 660 796 129
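To make the proposed workflow concrete, here is a rough sketch assuming pip-tools is used to generate the deterministic requirements and pip's standard -c/--constraint flag is used for user overrides; the file names, package names, and versions below are hypothetical:

    # generate a fully pinned requirements.txt from the open ranges in setup.py
    pip install pip-tools
    pip-compile --output-file requirements.txt setup.py
    # (running the same step from a Python 3 environment would yield the
    #  separate requirements3.txt mentioned above)

    # release and CI installs then use the pinned, deterministic set
    pip install -r requirements.txt

    # a user who needs a different, still-in-range dependency version can
    # override it with a constraints file when installing the release
    echo "pandas==0.23.4" > my-constraints.txt
    pip install apache-airflow -c my-constraints.txt

The generated requirements.txt would contain exact pins (e.g. pandas==0.23.4), which is what keeps an old release installable long after its open ranges would otherwise resolve to newer, incompatible libraries.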