Thank you for this response. It took me quite some time to read it
carefully; it was far beyond my expectations! So thanks a lot, there is a
lot to digest.

> There are actually three different relevant use cases here, with some
> patterns available to draw from. I'm going to spell them out to just make
> sure we're on the same page.
> * Library
> * Application
> * Suite of Coordinated Applications


Can we say that packaging a Python application for a Linux distribution may
fall into this "Suite of Coordinated Applications" category?

I really liked the way you described these differences; I have actually
started using it as the basis for the internal Python courses I give
regularly at my company :)

As the maintainer of a small project (Guake), I am in contact with package
maintainers (Debian, Arch, and now Fedora, ...) who report these kinds of
issues to me (Guake mainly has system Python dependencies such as PyGTK).
The maintainers face issues similar to OpenStack's, in that they need to
synchronize all dependencies for a given distribution release. Of course,
they do not use this upper-constraints requirements file as a way of pinning
the same dependencies for all applications.

> For Pipfile, I believe what we'd want to see is adding support for
> --constraints to pipenv install - so that we can update our Pipfile.lock
> file for each application in the context of the global constraints file.
> This can be simulated today without any support from pipenv directly like
> this:
>   pipenv install
>   $(pipenv --venv)/bin/pip install -U -c
> https://git.openstack.org/cgit/openstack/requirements/plain/upper-constraints.txt
> -r requirements.txt
>   pipenv lock

Unfortunately, `pipenv lock` will not lock the dependencies added manually
by pip. What you can do is open a bug on the pipenv project [1] on my
behalf, describing your need, and assign me to it. I am not sure how this
would need to be handled, but it is worth notifying the pipenv project of
your needs (I have never used the -c option of pip install).

>> There is also work on a PEP around pyproject.toml ([4]), which looks quite
>> similar to PBR's setup.cfg. What do you think about it?
> It's a bit different. There is also a philosophical disagreement about the
> use of TOML that's not worth going in to here

We agree on TOML.

> - but from a pbr perspective I'd like to minimize use of pyproject.toml to
> the bare minimum needed to bootstrap things into pbr's control. In the first
> phase I expect to replace our current setup.py boilerplate:
> setuptools.setup(
>     setup_requires=['pbr'],
>     pbr=True)
> with:
> setuptools.setup(pbr=True)
> and add pyproject.toml files with:
> [build-system]
> requires = ["setuptools", "wheel", "pbr"]

That would indeed go way beyond everything I wanted to work on for pbr.
Do you have a plan for this support in pbr?

>> My opinion is this difference in behaviour between lib and app has
>> technical reasons, but as a community we would gain a lot by unifying both
>> workflows. I am using PBR + a few hacks [5], and I am pretty satisfied with
>> the overall result.
> There are two topics your pbr patch opens up that need to be covered:
> * pbr behavior
> * dependencies
> ** pbr behavior **
> I appreciate what you're saying about unifying the lib and app workflow,
> but I think the general pattern across the language communities (javascript
> and rust both have similar patterns to Pipefile) is that the two different
> options are important.

> We may just need to have a better document - rust has an excellent
> description:
> https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html

There are a lot of discussions in the pipenv community at the moment; we
ended up with these best practices for Python:
- app: Pipfile and Pipfile.lock tracked in git
- library: do not track Pipfile*, but use ‘pipenv install -e .’
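
For reference, a minimal application-style Pipfile might look like this
(the package names here are just examples, not a recommendation):

```toml
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = ">=2.18"

[dev-packages]
pytest = "*"
```

For an application, this file and the generated Pipfile.lock would both be
tracked in git; for a library, neither would be.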

I am trying to amend it with pbr support for libs (and there are actually
several people who use pbr).

> In any case, I think what pbr should do with pipfiles is:
> * If pbr discovers a Pipfile and no Pipfile.lock, it should treat the
> content in the packages section of Pipfile as it currently does with
> requirements.txt (and how you have done in the current patch)

That’s the goal of my patch. If I understand correctly how it works (please
tell me if I am wrong), when building a distribution package/wheel, pbr
looks for requirements.txt and injects its contents as dependencies. The
major change in my patch is having to vendor a basic TOML parser.
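
As far as I understand it, that requirements.txt handling can be pictured
with a small sketch like this (not pbr's actual code, just an illustration
of the parsing step):

```python
def parse_requirements(lines):
    """Collect requirement specifiers, skipping blank lines, comments,
    and pip options such as '-r' or '-c' includes."""
    reqs = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or line.startswith("-"):
            continue
        reqs.append(line)
    return reqs

# The resulting list is what would be fed to setuptools as install_requires.
```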

> * If pbr discovers a Pipfile.lock, it should treat the content in
> Pipfile.lock as it currently does requirements.txt.

This is actually the “normal” behavior of pipenv, and more generally, of
the Pipfile parser: prefer the lock file over the Pipfile. The ultimate
source of truth is the lock file, and the Pipfile is only a convenient way
of describing the dependencies, at least for apps.
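
For illustration, a Pipfile.lock is JSON and looks roughly like this
(heavily trimmed; the `_meta` section normally also records the sources and
a hash of the Pipfile it was generated from):

```json
{
    "_meta": {
        "pipfile-spec": 6,
        "requires": {"python_version": "3.6"}
    },
    "default": {
        "requests": {
            "version": "==2.18.4",
            "hashes": ["sha256:..."]
        }
    },
    "develop": {}
}
```

Every entry is fully pinned with hashes, which is why it can serve as the
source of truth for an application deployment.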

> Then, we either need to:
> * Add support to pipfile install for specifying a pip-style constraints
> file

I am not sure I’ll be the right dev to handle the constraints feature...

> * Add support to pipfile install for specifying a constraints file that is
> in the format of a Pipfile.lock - but which does the same thing.
> * Write a pbr utility subcommand for generating a Pipfile.lock from a
> Pipfile taking a provided constraints file into account.

OK, you mean that since OpenStack will keep the requirements constraints,
pbr needs to support them to stay compatible. pipenv is already able to read
from a requirements file, but it reflects the result in the Pipfile (that’s
probably not wanted).

> We may also want to write a utility for creating a Pipfile and/or lock
> from a pbr-oriented requirements.txt/test-requirements.txt. (it should use
> pipfile on the backend of course) that can do the appropriate initial dance.

Do you mean a constraints file or any requirements.txt? pipenv is already
able to convert a requirements.txt to a Pipfile (via `pipenv install -r
requirements.txt`).

> ** dependencies **
> The pep518 support in pip10 is really important here. Because ...
> We should not vendor code into pbr.

Oops, I was thinking exactly that this would cause problems. I actually
vendor two deps: the “pipfile” parser and the “toml” parser.

> While vendoring code has been found to be acceptable by other portions of
> the Python community, it is not acceptable here.

There are intense discussions about this vendoring issue in pipenv; for
example, some deps are actually vendored three times (requests, if I
remember correctly).

So I agree this is not acceptable, but I need to better understand how pip
10 works.

> Once pip10 is released next week with pyproject.toml support, as mentioned
> earlier, we'll be able to start using (a reasonable set) of dependencies as
> is appropriate. In order to ensure backwards compat, I would recommend we
> do the following:
> * Add toml and pipfile as depends to pbr
> * Protect their imports with a try/except (for people using old pip which
> won't install any depends pbr has)
> * Declare that pbr support for Pipfile will only work with people using
> pip>=10 and for projects that add a pyproject.toml to their project
> containing
>   [build-system]
>   requires = ["pbr"]
> * If pbr tries to import toml/pipfile and fails, it should fall back to
> reading requirements.txt (this allows us to keep backwards compat until
> it's reasonable to expect everyone to be on pip10)
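
The guarded import from that list could look something like this (a sketch
only; the helper and file names are mine, not pbr's):

```python
import os

try:
    import toml  # only importable when pbr's own deps were installed (pip>=10)
except ImportError:
    toml = None


def read_dependencies(pipfile_path="Pipfile",
                      requirements_path="requirements.txt"):
    """Prefer the Pipfile when a toml parser is available; otherwise
    fall back to the historical requirements.txt behaviour."""
    if toml is not None and os.path.exists(pipfile_path):
        data = toml.load(pipfile_path)
        return sorted(data.get("packages", {}))
    with open(requirements_path) as f:
        return [line.strip() for line in f
                if line.strip() and not line.startswith("#")]
```

This keeps old-pip users on the requirements.txt path with no behaviour
change, while pip>=10 users transparently get the Pipfile path.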

This seems like much more than what I was expecting for the Pipfile
support, but if you (the OpenStack community and the pbr developers) can
guide me on this, I think I can work on this full pip10+Pipfile support.

The thing is, for the moment the current situation works pretty well for
us: pbr does a marvelous job on both our libraries and applications (but it
requires generating the requirements.txt with pipenv-to-requirements).
I started using reno and its Sphinx plugin (see the Guake source code :)

> To support that last point, we should write a utility function, let's call
> it 'pbr lock', with the following behavior:
> * If a Pipfile and a Pipfile.lock are present, it runs:
>   pipfile lock -r

I am not sure why that would be needed; if the project uses pipenv, the
normal way is to run `pipenv lock` directly to generate Pipfile.lock.
I am also not sure I understand the need for generating the
requirements.txt with `pipenv lock -r`. Is it to support projects that use
Pipfile but run an old pip (<10)? Is there a reason not to update pip to
the latest version?

> * If there is no Pipfile.lock, simply read the Pipfile and write the
> specifiers into requirements.txt in non-pinned format.
> This will allow pbr users to maintain their projects in such a way as to
> be backwards compatible while they start to use Pipfile/Pipfile.lock
> We MAY want to consider adding an option flag to setup.cfg, like:
>   [pbr]
>   type = application
> or
>   [pbr]
>   type = library
> for declaring to pbr which of Pipfile / Pipfile.lock should pbr pay
> attention to, regardless of which files might be present.
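
The "no Pipfile.lock" branch quoted above could be sketched roughly like
this (the function name is mine; a real implementation would live behind a
`pbr lock` entry point):

```python
def pipfile_to_requirements(packages):
    """Render a Pipfile [packages] table as loose (non-pinned)
    requirement specifiers, one per line."""
    lines = []
    for name, spec in packages.items():
        if spec == "*":
            lines.append(name)                    # any version is fine
        elif isinstance(spec, str):
            lines.append(name + spec)             # e.g. "requests>=2.18"
        elif isinstance(spec, dict) and "version" in spec:
            suffix = "" if spec["version"] == "*" else spec["version"]
            lines.append(name + suffix)
    return "\n".join(lines)
```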

I really like this feature. May I add another setting to pbr: allow or
disallow the "v"-prefixed version?
I have added support for versions with a “v” or “V” prefix, which works
really well with GitLab’s protected-tag regular expressions.
For example, I would like my project to fail if someone creates a tag on
master without the v prefix. It is not mandatory, but when the project is
handed over to new maintainers, they may make the mistake of using a
version without v/V, and it will silently work.
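
The check I have in mind is essentially a tag-format gate; a hypothetical
sketch (the setting name and regexes are made up, not existing pbr options):

```python
import re

# Hypothetical strict mode: only accept tags such as "v1.2.3" or "V1.2.3",
# matching a GitLab protected-tag pattern like "v*".
V_TAG = re.compile(r"^[vV]\d+(\.\d+)*$")


def check_tag(tag, require_v_prefix=True):
    """Return True when the tag is a valid version tag under the policy."""
    if require_v_prefix:
        return bool(V_TAG.match(tag))
    # Lenient mode: the prefix is optional.
    return bool(re.match(r"^[vV]?\d+(\.\d+)*$", tag))
```

With `require_v_prefix` enabled, a bare "1.2.3" tag would be rejected
instead of silently producing a release.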

> I'm not sure whether that would be better or worse than inferring behavior
> from the presence of files.

I like the convention-over-configuration approach, but having a way of
describing the desired behavior explicitly in a config file is also great!

Thanks for this very interesting answer; I hope to have many other great
exchanges like this one!


[1] pipenv: https://github.com/pypa/pipenv

OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
