Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Glyph Lefkowitz

> On Feb 15, 2017, at 11:44 AM, Donald Stufft  wrote:
> 
> 
>> On Feb 15, 2017, at 1:15 PM, Daniel Holth wrote:
>> 
>> I also get a little frustrated with this kind of "no pins" proposal, which I 
>> read as "annoy the publisher to try to prevent them from annoying the 
>> consumer". As a free software publisher I feel entitled to annoy the 
>> consumer, an activity I will indulge in to a degree inversely proportional 
>> to my desire for users. Who is the star?
>> 
>> It should be possible to publish applications to PyPI. Much of the packaging 
>> we have is completely web-application focused, yet these applications are not 
>> usually published at all.
>> 
> 
> 
> 
> I haven’t fully followed this thread, and while the recommendation is and 
> will always be to use the least strict version specifier that will work for 
> your application, I am pretty heavily -1 on mandating that people do not use 
> ``==``. I am also fairly heavily -1 on confusing the data model even more by 
> making two sets of dependencies, one that allows == and one that doesn’t. 

I hope I'm not repeating a suggestion that appears up-thread, but if you want 
to distribute an application with pinned dependencies, you could always 
release 'foo-lib' with a lenient set of dependencies, and 'foo-app', which 
depends on 'foo-lib' but pins the transitive closure of all dependencies with 
'=='.  Your CI system could automatically release a new 'foo-app' every time 
any dependency has a new release and a build against the last release of 
'foo-app' passes.
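
For illustration, a minimal sketch of that CI step (the 'foo-lib'/'foo-app'
names are hypothetical): freeze the environment that 'foo-lib' resolves to,
then feed the resulting pins into the metadata for the next 'foo-app' release.

    # Hypothetical CI step: freeze the environment resolved for foo-lib
    # and emit pinned requirements for a new foo-app release.
    import subprocess

    def pinned_requirements():
        # 'pip freeze' prints one "name==version" line per installed package
        frozen = subprocess.run(
            ["pip", "freeze"], capture_output=True, text=True, check=True
        ).stdout.splitlines()
        # keep only real pins; foo-app depends on foo-lib separately
        return [line for line in frozen
                if "==" in line and not line.startswith("foo-lib==")]

    for req in pinned_requirements():
        print(req)  # e.g. feed into foo-app's install_requires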

-glyph



Re: [Distutils] Python installation not working

2017-02-15 Thread Eric Brunson
h...@python.org is also set up to provide this kind of assistance.

On Wed, Feb 15, 2017 at 10:05 AM Brett Cannon  wrote:

> This actually isn't the right place to ask for installation help, Chitra
> (this list is about how to package up Python projects). For general support
> questions you should email python-list.
>
> On Wed, 15 Feb 2017 at 05:11 Chitra Dewan via Distutils-SIG <
> distutils-sig@python.org> wrote:
>
> Hello,
>
> I am a *beginner in Python*.
> I am facing problems installing Python 3.5 on my Windows Vista x32
> machine.
> I downloaded python-3.5.2.exe from Python.org. It is downloaded as an
> exe. When I try to install it via "Run as administrator", nothing
> happens. Same behavior with the 3.6 version.
>
> Kindly advise.
>
>
> Regards & Thanks, Chitra Dewan


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Donald Stufft

> On Feb 15, 2017, at 1:15 PM, Daniel Holth  wrote:
> 
> I also get a little frustrated with this kind of "no pins" proposal, which I 
> read as "annoy the publisher to try to prevent them from annoying the 
> consumer". As a free software publisher I feel entitled to annoy the 
> consumer, an activity I will indulge in to a degree inversely proportional 
> to my desire for users. Who is the star?
> 
> It should be possible to publish applications to PyPI. Much of the packaging 
> we have is completely web-application focused, yet these applications are not 
> usually published at all.
> 



I haven’t fully followed this thread, and while the recommendation is and will 
always be to use the least strict version specifier that will work for your 
application, I am pretty heavily -1 on mandating that people do not use ``==``. 
I am also fairly heavily -1 on confusing the data model even more by making two 
sets of dependencies, one that allows == and one that doesn’t. I don’t think 
that overly restrictive pins are that common a problem (if anything, we’re 
more likely to have too-loose pins, due to the always-upgrade nature of pip 
and the difficulty of exhaustively testing every possible version combination).

In cases where this actively harms the end user (effectively, when there is a 
security issue or a conflict) we can tell the user about it (theoretically; not 
in practice yet), but beyond that, this is best handled by opening issues on 
each individual repository, just like any other packaging issue with that 
project.

—
Donald Stufft





Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Daniel Holth
I also get a little frustrated with this kind of "no pins" proposal, which I
read as "annoy the publisher to try to prevent them from annoying the
consumer". As a free software publisher I feel entitled to annoy the
consumer, an activity I will indulge in to a degree inversely proportional to
my desire for users. Who is the star?

It should be possible to publish applications to PyPI. Much of the
packaging we have is completely web-application focused, yet these
applications are not usually published at all.

On Wed, Feb 15, 2017 at 12:58 PM Paul Moore  wrote:

> Thanks for your reply, it was very helpful.
>
> On 15 February 2017 at 16:55, Freddy Rietdijk 
> wrote:
> > Larger applications that have many dependencies that are fixed have been
> > kept out of Nixpkgs for now.
>
> I notice here (and in a few other places) you talk about
> "Applications". From what I understand of Nick's position,
> applications absolutely should pin their dependencies - so if I'm
> understanding correctly, those applications will (and should) continue
> to pin exact versions.
>
> As regards automatic packaging of new upstream versions (of libraries
> rather than applications), I guess if you get upstream to remove the
> pinned versions, this problem goes away.
>
> > The main problem I see is that it limits how far you can
> > automatically update to newer versions and thus release bug/security fixes.
> > Just one inappropriate pin is sufficient to break dependency solving.
>
> I'm not sure I follow this. Suppose we have foo 1.0 depending on bar.
> If foo 1.0 doesn't pin bar (possibly because you reported to them
> that they shouldn't) then foo 1.1 isn't going to suddenly add the pin
> back. So you can update foo fine. And you can update bar because
> there's no pin. So yes, while "one inappropriate pin" can cause a
> problem, getting upstream to fix that is a one-off cost, not an
> ongoing issue.
>
> So, in summary,
>
> * I agree that libraries pinning dependencies too tightly is bad.
> * Distributions can easily enough report such pins upstream when the
> library is initially packaged, so there's no ongoing cost here (just
> possibly a delay before the library can be packaged).
> * Libraries can legitimately have appropriate pins (typically to
> ranges of versions). So distributions have to be able to deal with
> that.
> * Applications *should* pin precise versions. Distributions have to
> decide whether to respect those pins or remove them and then take on
> support of the combination that upstream doesn't support.
> * But application pins should be in a requirements.txt file, so
> ignoring version specs is pretty simple (just a script to run against
> the requirements file).
> * Because Python doesn't support multiple installed versions of
> packages, conflicting requirements *will* be a problem that distros
> have to solve themselves (the language response is "use a venv").
>
> Nick is suggesting that the requirement metadata be prohibited from
> using exact pins, but there's alternative metadata for "yes, I really
> mean an exact pin". To me:
>
> 1. This doesn't have any bearing on *application* pins, as they aren't
> in metadata.
> 2. Distributions still have to be able to deal with libraries having
> exact pins, as it's an explicitly supported possibility.
> 3. You can still manage (effectively) exact pins without being
> explicit - foo >1.6,<1.8 pretty much does it. And that doesn't even
> have to be a deliberate attempt to break the system, it could be a
> genuine attempt to avoid known issues, that just got too aggressive.
>
> So we're left with additional complexity for library authors to
> understand, for what seems like no benefit in practice to distribution
> builders. The only stated benefit of the 2 types of metadata is to
> educate library authors of the benefits of not pinning versions - and
> it seems like a very sweeping measure, where bug reports from
> distributions seem like they would be a much more focused and just as
> effective approach.
>
> Paul


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Nathaniel Smith
On Feb 15, 2017 07:41, "Nick Coghlan"  wrote:

>> pipenv borrows the Ruby solution to modeling this by having Pipfile
>> for abstract dependency declarations and Pipfile.lock for concrete
>> integration testing ones, so the idea here is to propagate that model
>> to pydist.json by separating the "requires" field with abstract
>> development dependencies from the "integrates" field with concrete
>> deployment dependencies.
>
> What's the benefit of putting this in pydist.json? I feel like for the
> usual deployment cases (a) going straight from Pipfile.lock -> venv is
> pretty much sufficient, with no need to put this into a package, but
> (b) if you really do want to put it into a package, then the natural
> approach would be to make an empty wheel like
> "my-django-app-deploy.whl" whose dependencies were the contents of
> Pipfile.lock.

My goal with the split is to get to a state where:

- exactly zero projects on PyPI use "==" or "===" in their requires
metadata (because PyPI explicitly prohibits it)
- the vast majority of projects on PyPI *don't* have an "integrates" section
- those projects that do have an `integrates` section have a valid
reason for it (like PyObjC)

For anyone making the transition from application and web service
development to library and framework development, the transition from
"always pin exact versions of your dependencies for deployment" to
"when publishing a library or framework, only rule out the
combinations that you're pretty sure *won't* work" is one of the
trickiest to deal with as current tools *don't alert you to the fact
that there's a difference to be learned*.

Restricting what can go into requires creates an opportunity to ask
users whether they're publishing a pre-integrated project or not: if
yes, then they add the "integrates" field and put their pinned
dependencies there; if not, then they relax the "==" constraints to
"~=" or ">=".


Ah-hah, this does make sense as a problem, thanks!

However, your solution seems very odd to me :-).

If the goal is to put an "are you sure/yes I'm sure" UX barrier between
users and certain version settings, then why make a distinction that every
piece of downstream software has to be aware of and ignore? PyPI seems like
a funny place in the stack to be implementing this. It would be much
simpler to implement this feature at the build system level, like e.g.
setuptools could require that dependencies that you think are over strict
be specified in an install_requires_yes_i_really_mean_it= field, without
requiring any metadata changes.

Basically it sounds like you're saying you want to extend the metadata so
that it can represent both broken and non-broken packages, so that both can
be created, passed around, and checked for. And I'm saying, how about
instead we do that checking when creating the package in the first place.

(Of course I can't see any way to do any of this that won't break existing
sdists, but I guess you've already decided you're OK with that. I guess I
should say that I'm a bit dubious that this is so important in the first
place; I feel like there are lots of legitimate use cases for ==
dependencies and lots of kinds of linting we might want to apply to try and
improve the level of packaging quality.)


Either way, PyPI will believe your answer, it's just refusing the
temptation to guess that using "==" or "===" in the requires section
is sufficient to indicate that you're deliberately publishing a
pre-integrated project.

> There's certainly a distinction to be made between the abstract
> dependencies and the exact locked dependencies, but to me the natural
> way to model that distinction is by re-using the distinction we
> already have between source packages and binary packages. The build
> process for this placeholder wheel is to "compile down" the abstract
> dependencies into concrete dependencies, and the resulting wheel
> encodes the result of this compilation. Again, no new concepts needed.

Source vs binary isn't where the distinction applies, though. For
example, it's legitimate for PyObjC to have pinned dependencies even
when distributed in source form, as it's a metapackage used solely to
integrate the various PyObjC subprojects into a single "release".


?? So that means that some packages have a loosely specified source that
compiles down to a more strictly specified binary, and some have a more
strictly specified source that compiles down to an equally strictly
specified binary. That's... an argument in favor of my way of thinking
about it, isn't it? That it can naturally express both situations?

My point is that *for the cases where there's an important distinction
between Pipfile and Pipfile.lock*, we already have a way to think about
that distinction without introducing new concepts.

-n


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Paul Moore
Thanks for your reply, it was very helpful.

On 15 February 2017 at 16:55, Freddy Rietdijk  wrote:
> Larger applications that have many dependencies that are fixed have been
> kept out of Nixpkgs for now.

I notice here (and in a few other places) you talk about
"Applications". From what I understand of Nick's position,
applications absolutely should pin their dependencies - so if I'm
understanding correctly, those applications will (and should) continue
to pin exact versions.

As regards automatic packaging of new upstream versions (of libraries
rather than applications), I guess if you get upstream to remove the
pinned versions, this problem goes away.

> The main problem I see is that it limits how far you can automatically 
> update to newer versions and thus release bug/security fixes. Just one 
> inappropriate pin is sufficient to break dependency solving.

I'm not sure I follow this. Suppose we have foo 1.0 depending on bar.
If foo 1.0 doesn't pin bar (possibly because you reported to them
that they shouldn't) then foo 1.1 isn't going to suddenly add the pin
back. So you can update foo fine. And you can update bar because
there's no pin. So yes, while "one inappropriate pin" can cause a
problem, getting upstream to fix that is a one-off cost, not an
ongoing issue.

So, in summary,

* I agree that libraries pinning dependencies too tightly is bad.
* Distributions can easily enough report such pins upstream when the
library is initially packaged, so there's no ongoing cost here (just
possibly a delay before the library can be packaged).
* Libraries can legitimately have appropriate pins (typically to
ranges of versions). So distributions have to be able to deal with
that.
* Applications *should* pin precise versions. Distributions have to
decide whether to respect those pins or remove them and then take on
support of the combination that upstream doesn't support.
* But application pins should be in a requirements.txt file, so
ignoring version specs is pretty simple (just a script to run against
the requirements file).
* Because Python doesn't support multiple installed versions of
packages, conflicting requirements *will* be a problem that distros
have to solve themselves (the language response is "use a venv").

Nick is suggesting that the requirement metadata be prohibited from
using exact pins, but there's alternative metadata for "yes, I really
mean an exact pin". To me:

1. This doesn't have any bearing on *application* pins, as they aren't
in metadata.
2. Distributions still have to be able to deal with libraries having
exact pins, as it's an explicitly supported possibility.
3. You can still manage (effectively) exact pins without being
explicit - foo >1.6,<1.8 pretty much does it. And that doesn't even
have to be a deliberate attempt to break the system, it could be a
genuine attempt to avoid known issues, that just got too aggressive.
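
As a quick illustration of point 3 with the 'packaging' library (the list of
available versions here is made up), a narrow range behaves almost exactly
like a pin:

    from packaging.specifiers import SpecifierSet

    spec = SpecifierSet(">1.6,<1.8")  # effectively pins the 1.7 series
    available = ["1.5", "1.6", "1.7", "1.7.1", "1.8"]  # hypothetical releases
    print([v for v in available if v in spec])  # -> ['1.7', '1.7.1']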

So we're left with additional complexity for library authors to
understand, for what seems like no benefit in practice to distribution
builders. The only stated benefit of the 2 types of metadata is to
educate library authors of the benefits of not pinning versions - and
it seems like a very sweeping measure, where bug reports from
distributions seem like they would be a much more focused and just as
effective approach.

Paul


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Jim Fulton
On Wed, Feb 15, 2017 at 11:55 AM, Freddy Rietdijk 
wrote:

> > Sort of repeating my earlier question, but how often does this happen
> > in reality?
>
> From a quick check in our repo we have patched about 1% of our packages to
> remove the constraints. We have close to 2000 Python packages. We don't
> necessarily patch all the constraints, only when they collide with the
> version we would like the package to use, so the actual percentage is likely
> higher.
>
> Larger applications that have many dependencies that are fixed have been
> kept out of Nixpkgs for now. Their fixed dependencies mean we likely need
> multiple versions of packages. While Nix can handle that, it means more
> maintenance. We have a tool that can take e.g. a requirements.txt file and
> generate expressions, but it won't help you much with bug-fix releases when
> maintainers don't update their pinned requirements.
>

I suppose this isn't a problem for Java applications, which use jar files
and per-application class paths.

Jim

-- 
Jim Fulton
http://jimfulton.info


Re: [Distutils] Python installation not working

2017-02-15 Thread Brett Cannon
This actually isn't the right place to ask for installation help, Chitra
(this list is about how to package up Python projects). For general support
questions you should email python-list.

On Wed, 15 Feb 2017 at 05:11 Chitra Dewan via Distutils-SIG <
distutils-sig@python.org> wrote:

> Hello,
>
> I am a *beginner in Python*.
> I am facing problems installing Python 3.5 on my Windows Vista x32
> machine.
> I downloaded python-3.5.2.exe from Python.org. It is downloaded as an
> exe. When I try to install it via "Run as administrator", nothing
> happens. Same behavior with the 3.6 version.
>
> Kindly advise.
>
>
> Regards & Thanks, Chitra Dewan


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Freddy Rietdijk
> Sort of repeating my earlier question, but how often does this happen
> in reality?

From a quick check in our repo we have patched about 1% of our packages to
remove the constraints. We have close to 2000 Python packages. We don't
necessarily patch all the constraints, only when they collide with the
version we would like the package to use, so the actual percentage is likely
higher.

Larger applications that have many dependencies that are fixed have been
kept out of Nixpkgs for now. Their fixed dependencies mean we likely need
multiple versions of packages. While Nix can handle that, it means more
maintenance. We have a tool that can take e.g. a requirements.txt file and
generate expressions, but it won't help you much with bug-fix releases when
maintainers don't update their pinned requirements.

> And how often is it that a simple request/PR to the package author to
> remove the explicit version requirements is rejected?

That's hard to say. If I look at what packages I've contributed to Nixpkgs,
then in my experience this is something that is typically dealt with by
upstream when asked.

> If you *do* get in a situation where the package explicitly requires
> certain versions of its dependencies, and you ignore those
> requirements, then presumably you're taking responsibility for
> supporting a combination that the upstream author doesn't support. How
> do you handle that?

Typical situations are bug-fix releases. So far I haven't encountered any
issues with using other versions, but like I said, larger applications that
pin their dependencies have been mostly kept out of Nixpkgs. If we do
encounter issues, then we have to find a solution. The likeliest situation
is that an application requires a different version, and in that case we
would then have an expression/package of that version specifically for that
application. We don't have a global site-packages, so we can do that.

> Nick said he wants to guide authors away from explicit version
> pinning. That's fine, but is the problem so big that the occasional
> bug report to offending projects saying "please don't pin exact
> versions" is insufficient guidance?

The main problem I see is that it limits how far you can automatically
update to newer versions and thus release bug/security fixes. Just one
inappropriate pin is sufficient to break dependency solving.


On Wed, Feb 15, 2017 at 5:14 PM, Paul Moore  wrote:

> On 15 February 2017 at 15:50, Freddy Rietdijk 
> wrote:
> > It's quite frustrating as a downstream having to deal with packages where
> > versions are pinned unnecessarily, and therefore I've also requested on the
> > Setuptools tracker a flag that ignores constraints [1] (though I fear I
> > would have to roll up my sleeves for this one :) ).
>
> Sort of repeating my earlier question, but how often does this happen
> in reality? (As a proportion of the packages you deal with). And how
> often is it that a simple request/PR to the package author to remove
> the explicit version requirements is rejected? (I assume your first
> response is to file an issue with upstream?)
>
> If you *do* get in a situation where the package explicitly requires
> certain versions of its dependencies, and you ignore those
> requirements, then presumably you're taking responsibility for
> supporting a combination that the upstream author doesn't support. How
> do you handle that?
>
> I'm not trying to pick on your process here, or claim that
> distributions are somehow doing things wrongly. But I am trying to
> understand what redistributors' expectations are of package authors.
> Nick said he wants to guide authors away from explicit version
> pinning. That's fine, but is the problem so big that the occasional
> bug report to offending projects saying "please don't pin exact
> versions" is insufficient guidance?
>
> Paul
>


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Paul Moore
On 15 February 2017 at 15:50, Freddy Rietdijk  wrote:
> It's quite frustrating as a downstream having to deal with packages where
> versions are pinned unnecessarily, and therefore I've also requested on the
> Setuptools tracker a flag that ignores constraints [1] (though I fear I
> would have to roll up my sleeves for this one :) ).

Sort of repeating my earlier question, but how often does this happen
in reality? (As a proportion of the packages you deal with). And how
often is it that a simple request/PR to the package author to remove
the explicit version requirements is rejected? (I assume your first
response is to file an issue with upstream?)

If you *do* get in a situation where the package explicitly requires
certain versions of its dependencies, and you ignore those
requirements, then presumably you're taking responsibility for
supporting a combination that the upstream author doesn't support. How
do you handle that?

I'm not trying to pick on your process here, or claim that
distributions are somehow doing things wrongly. But I am trying to
understand what redistributors' expectations are of package authors.
Nick said he wants to guide authors away from explicit version
pinning. That's fine, but is the problem so big that the occasional
bug report to offending projects saying "please don't pin exact
versions" is insufficient guidance?

Paul


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Thomas Kluyver
On Wed, Feb 15, 2017, at 03:40 PM, Daniel Holth wrote:

> It would make sense to go ahead and delete the obsolete fields, I'm
> sure they were overlooked because they are not common in the wild.
> 

> From PEP 345:

>  * Deprecated fields:
>    * Requires (in favor of Requires-Dist)
>    * Provides (in favor of Provides-Dist)
For reference, packages made with flit do use 'Provides' to indicate the
name of the importable module or package that the distribution installs.
This seems to me to be something worth exposing - in another thread,
we're discussing downloading and scanning packages to get this
information. But I accept that it's not very useful while only a tiny
minority of packages do it.
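
For anyone who wants to check a package for this, METADATA is an RFC
822-style header block, so the field can be read back with just the standard
library; a minimal sketch (the file path is hypothetical):

    from email.parser import Parser

    # *.dist-info/METADATA uses email-style headers; the path is made up
    with open("flit_example-1.0.dist-info/METADATA") as f:
        meta = Parser().parsestr(f.read())

    print(meta.get_all("Provides"))       # e.g. ['flit_example'], or None
    print(meta.get_all("Requires-Dist"))  # PEP 508 dependency specifiers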


Thomas


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Freddy Rietdijk
> Maybe it would help if you have a concrete example of a
> scenario where they would benefit from having this distinction?

In the Nix package manager (source distribution with binary substitutes)
and Nixpkgs package set we typically require the filename and hash of a
package. In our expressions we typically pass a URL (that includes the
name) and the hash. The URL is only needed when the file isn't in our
store. This is convenient, because making the URL optional allows you to
pre-fetch or work with mirrors. All we care about is that we get the file,
not how it is provided. This applies to source archives, but behind the
scenes also for binary substitutes. With Nix, functions build a package,
and dependencies are passed as function arguments with names that
typically, but not necessarily, resemble the dependency name.

Now, a function that builds a package, a package builder, only needs to be
provided with abstract dependencies; it just needs to know what it should
look for, "we need 'a' numpy, 'a' scipy, 'a compiler that has a certain
interface and can do this job'", etc. Version numbers can help it fail
early, but generally only as bounds, not as a pinned value. It's up to
another tool to provide the builder with the actual packages, the concrete
dependencies. And this tool might fetch them from PyPI, or
from GitHub, or...

The same goes for building, distributing and installing Python packages.
Setuptools shouldn't bother with versions (except the constraints, in the
case of libraries) or with where a source comes from, but should just build
or fail. Pip
should just fetch/resolve and pass concrete dependencies to whatever
builder (Setuptools, Flit), or whatever environment (virtualenv) needs it.

It's quite frustrating as a downstream having to deal with packages where
versions are pinned unnecessarily, and therefore I've also requested on the
Setuptools tracker a flag that ignores constraints [1] (though I fear I
would have to roll up my sleeves for this one :) ).

[1] https://github.com/pypa/setuptools/issues/894

On Wed, Feb 15, 2017 at 3:11 PM, Nathaniel Smith  wrote:

> On Wed, Feb 15, 2017 at 5:27 AM, Nick Coghlan  wrote:
> > On 15 February 2017 at 12:58, Nathaniel Smith  wrote:
> >> On Wed, Feb 15, 2017 at 3:33 AM, Nick Coghlan 
> wrote:
> >>> - "requires": list where entries are either a string containing a PEP
> >>> 508 dependency specifier or else a hash map containing a "requires" key
> >>> plus "extra" or "environment" fields as qualifiers
> >>> - "integrates": replacement for "meta_requires" that only allows
> >>> pinned dependencies (i.e. hash maps with "name" & "version" fields, or
> >>> direct URL references, rather than a general PEP 508 specifier as a
> >>> string)
> >>
> >> What's accomplished by separating these? I really think we should
> >> strive to have fewer more orthogonal concepts whenever possible...
> >
> > It's mainly a matter of incorporating
> > https://caremad.io/posts/2013/07/setup-vs-requirement/ into the core
> > data model, as this distinction between abstract development
> > dependencies and concrete deployment dependencies is incredibly
> > important for any scenario that involves
> > publisher-redistributor-consumer chains, but is entirely non-obvious
> > to folks that are only familiar with the publisher-consumer case that
> > comes up during development-for-personal-and-open-source-use.
>
> Maybe I'm just being dense but, umm. I don't know what any of these
> words mean :-). I'm not unfamiliar with redistributors; part of my
> confusion is that this is a concept that AFAIK distro package systems
> don't have. Maybe it would help if you have a concrete example of a
> scenario where they would benefit from having this distinction?
>
> > One particular area where this is problematic is in the widespread
> > advice "always pin your dependencies" which is usually presented
> > without the all important "for application or service deployment"
> > qualifier. As a first approximation:
> > pinning-for-app-or-service-deployment == good,
> > pinning-for-local-testing == good,
> > pinning-for-library-or-framework-publication-to-PyPI == bad.
> >
> > pipenv borrows the Ruby solution to modeling this by having Pipfile
> > for abstract dependency declarations and Pipfile.lock for concrete
> > integration testing ones, so the idea here is to propagate that model
> > to pydist.json by separating the "requires" field with abstract
> > development dependencies from the "integrates" field with concrete
> > deployment dependencies.
>
> What's the benefit of putting this in pydist.json? I feel like for the
> usual deployment cases (a) going straight from Pipfile.lock -> venv is
> pretty much sufficient, with no need to put this into a package, but
> (b) if you really do want to put it into a package, then the natural
> approach would be to make an empty wheel like
> "my-django-app-deploy.whl" whose 

Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Paul Moore
On 15 February 2017 at 15:41, Nick Coghlan  wrote:
> My goal with the split is to get to a state where:
>
> - exactly zero projects on PyPI use "==" or "===" in their requires
> metadata (because PyPI explicitly prohibits it)
> - the vast majority of projects on PyPI *don't* have an "integrates" section
> - those projects that do have an `integrates` section have a valid
> reason for it (like PyObjC)

So how many projects on PyPI currently have == or === in their
requires? I've never seen one (although my sample size isn't large, it
does cover major packages in a largish range of application
areas).

I'm curious as to how major this problem is in practice. I (now)
understand the theoretical argument for the proposal.

Paul


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Nick Coghlan
On 15 February 2017 at 15:58, Paul Moore  wrote:
> On 15 February 2017 at 14:11, Nathaniel Smith  wrote:
>>> It's mainly a matter of incorporating
>>> https://caremad.io/posts/2013/07/setup-vs-requirement/ into the core
>>> data model, as this distinction between abstract development
>>> dependencies and concrete deployment dependencies is incredibly
>>> important for any scenario that involves
>>> publisher-redistributor-consumer chains, but is entirely non-obvious
>>> to folks that are only familiar with the publisher-consumer case that
>>> comes up during development-for-personal-and-open-source-use.
>>
>> Maybe I'm just being dense but, umm. I don't know what any of these
>> words mean :-). I'm not unfamiliar with redistributors; part of my
>> confusion is that this is a concept that AFAIK distro package systems
>> don't have. Maybe it would help if you have a concrete example of a
>> scenario where they would benefit from having this distinction?
>
> I'm also finding this discussion bafflingly complex. I understand that
> distributions need a way to work with Python packages, but the issues
> involved seem completely divorced from the basic process of a user
> using pip to install a package with the dependencies it needs to work
> in their program.

As simple as I can make it:

* pinning dependencies when publishing to PyPI is presumptively bad
* PyPI itself (not client tools) should warn you that it's a bad idea
* however, there are legitimate use cases for pinning in PyPI packages
* so there should be a way to do it, but it should involve telling
PyPI "I am an integration project, this is OK"

Most people should never touch the "integrates" field, they should
just change "==" to "~=" or ">=" to allow for future releases of their
dependencies.
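
As a concrete sketch of that relaxation, using the 'packaging' library (the
version numbers are hypothetical):

    from packaging.specifiers import SpecifierSet

    pinned  = SpecifierSet("==1.4.2")  # only ever matches 1.4.2
    relaxed = SpecifierSet("~=1.4.2")  # compatible release: >=1.4.2, <1.5
    print("1.4.3" in pinned, "1.4.3" in relaxed)  # -> False True
    print("1.5.0" in relaxed)                     # -> False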

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Nick Coghlan
On 15 February 2017 at 15:11, Nathaniel Smith  wrote:
> On Wed, Feb 15, 2017 at 5:27 AM, Nick Coghlan  wrote:
>> It's mainly a matter of incorporating
>> https://caremad.io/posts/2013/07/setup-vs-requirement/ into the core
>> data model, as this distinction between abstract development
>> dependencies and concrete deployment dependencies is incredibly
>> important for any scenario that involves
>> publisher-redistributor-consumer chains, but is entirely non-obvious
>> to folks that are only familiar with the publisher-consumer case that
>> comes up during development-for-personal-and-open-source-use.
>
> Maybe I'm just being dense but, umm. I don't know what any of these
> words mean :-). I'm not unfamiliar with redistributors; part of my
> confusion is that this is a concept that AFAIK distro package systems
> don't have. Maybe it would help if you have a concrete example of a
> scenario where they would benefit from having this distinction?

It's about error messages and nudges in the UX: if PyPI rejects
version pinning in "requires" by default, then that creates an
opportunity to nudge people towards using "~=" or ">=" instead (as in
the vast majority of cases, that will be a better option than
pinning-for-publication).

The inclusion of "integrates" then adds back the support for
legitimate version pinning use cases in pydist.json in a way that
makes it clear that it is a conceptually distinct operation from a
normal dependency declaration.

>> pipenv borrows the Ruby solution to modeling this by having Pipfile
>> for abstract dependency declarations and Pipfile.lock for concrete
>> integration testing ones, so the idea here is to propagate that model
>> to pydist.json by separating the "requires" field with abstract
>> development dependencies from the "integrates" field with concrete
>> deployment dependencies.
>
> What's the benefit of putting this in pydist.json? I feel like for the
> usual deployment cases (a) going straight from Pipfile.lock -> venv is
> pretty much sufficient, with no need to put this into a package, but
> (b) if you really do want to put it into a package, then the natural
> approach would be to make an empty wheel like
> "my-django-app-deploy.whl" whose dependencies were the contents of
> Pipfile.lock.

My goal with the split is to get to a state where:

- exactly zero projects on PyPI use "==" or "===" in their requires
metadata (because PyPI explicitly prohibits it)
- the vast majority of projects on PyPI *don't* have an "integrates" section
- those projects that do have an `integrates` section have a valid
reason for it (like PyObjC)

For anyone making the transition from application and web service
development to library and framework development, the transition from
"always pin exact versions of your dependencies for deployment" to
"when publishing a library or framework, only rule out the
combinations that you're pretty sure *won't* work" is one of the
trickiest to deal with, as current tools *don't alert you to the fact
that there's a difference to be learned*.

Restricting what can go into requires creates an opportunity to ask
users whether they're publishing a pre-integrated project or not: if
yes, then they add the "integrates" field and put their pinned
dependencies there; if not, then they relax the "==" constraints to
"~=" or ">=".

Either way, PyPI will believe your answer, it's just refusing the
temptation to guess that using "==" or "===" in the requires section
is sufficient to indicate that you're deliberately publishing a
pre-integrated project.

> There's certainly a distinction to be made between the abstract
> dependencies and the exact locked dependencies, but to me the natural
> way to model that distinction is by re-using the distinction we
> already have between source packages and binary packages. The build
> process for this placeholder wheel is to "compile down" the abstract
> dependencies into concrete dependencies, and the resulting wheel
> encodes the result of this compilation. Again, no new concepts needed.

Source vs binary isn't where the distinction applies, though. For
example, it's legitimate for PyObjC to have pinned dependencies even
when distributed in source form, as it's a metapackage used solely to
integrate the various PyObjC subprojects into a single "release".

>> In the vast majority of publication-to-PyPI cases people won't need
>> the "integrates" field, since what they're publishing on PyPI will
>> just be their abstract dependencies, and any warning against using
>> "==" will recommend using "~=" or ">=" instead. But there *are*
>> legitimate uses of pinning-for-publication (like the PyObjC
>> metapackage bundling all its subcomponents, or when building for
>> private deployment infrastructure), so there needs to be a way to
>> represent "Yes, I'm pinning this dependency for publication, and I'm
>> aware of the significance of doing so"
>
> Why can't PyObjC just use regular dependencies? That's what distro
> metapackages have done for decades, right?

Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Daniel Holth
IIUC, PEP 345, the predecessor of PEP 426, replaced Requires with
Requires-Dist because the former was never very well specified; it was easier
to rename the field than to redefine it. bdist_wheel's egg-info
conversion assumes the only useful requirements are in the setuptools
requires.txt. It would make sense to go ahead and delete the obsolete
fields; I'm sure they were overlooked because they are not common in the
wild.

From PEP 345:

- Deprecated fields:
  - Requires (in favor of Requires-Dist)
  - Provides (in favor of Provides-Dist)
  - Obsoletes (in favor of Obsoletes-Dist)


On Wed, Feb 15, 2017 at 10:31 AM Vinay Sajip 
wrote:

> > the full METADATA format is documented in the pre-JSON revision of PEP
> > 426.
>
> Can you confirm which exact revision in the PEPs repo you mean? I could
> guess at 0451397. That version does not refer to a field "Requires"
> (rather, the more recent "Requires-Dist"). Your conversion function reads
> the existing PKG-INFO, updates the Metadata-Version, and adds
> "Provides-Dist" and "Requires-Dist". It does not check whether the result
> conforms to that version of the PEP. For example, in the presence of
> "Requires" in PKG-INFO, you add "Requires-Dist", possibly leading to an
> ambiguity, because they sort of mean the same thing but could contain
> conflicting information (for example, different version constraints). The
> python-dateutils wheel which Jim referred to contained both "Requires" and
> "Requires-Dist" fields in its METADATA file, and, faced with a metadata set
> with both fields, the old packaging code used by distlib to handle the
> different metadata versions raised an "Unknown metadata set" error. In the
> face of ambiguity, it's refusing the temptation to guess :-)
>
> If the conversion function adds "Requires-Dist" but doesn't remove
> "Requires", I'm not sure it conforms to that version of the PEP.
>
> Regards,
>
> Vinay Sajip
>


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Vinay Sajip via Distutils-SIG
> the full METADATA format is documented in the pre-JSON revision of PEP 426.

Can you confirm which exact revision in the PEPs repo you mean? I could guess
at 0451397. That version does not refer to a field "Requires" (rather, the
more recent "Requires-Dist"). Your conversion function reads the existing
PKG-INFO, updates the Metadata-Version, and adds "Provides-Dist" and
"Requires-Dist". It does not check whether the result conforms to that
version of the PEP. For example, in the presence of "Requires" in PKG-INFO,
you add "Requires-Dist", possibly leading to an ambiguity, because they sort
of mean the same thing but could contain conflicting information (for
example, different version constraints). The python-dateutils wheel which Jim
referred to contained both "Requires" and "Requires-Dist" fields in its
METADATA file, and, faced with a metadata set with both fields, the old
packaging code used by distlib to handle the different metadata versions
raised an "Unknown metadata set" error. In the face of ambiguity, it's
refusing the temptation to guess :-)

If the conversion function adds "Requires-Dist" but doesn't remove
"Requires", I'm not sure it conforms to that version of the PEP.

Regards, 

Vinay Sajip


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Paul Moore
On 15 February 2017 at 14:11, Nathaniel Smith  wrote:
>> It's mainly a matter of incorporating
>> https://caremad.io/posts/2013/07/setup-vs-requirement/ into the core
>> data model, as this distinction between abstract development
>> dependencies and concrete deployment dependencies is incredibly
>> important for any scenario that involves
>> publisher-redistributor-consumer chains, but is entirely non-obvious
>> to folks that are only familiar with the publisher-consumer case that
>> comes up during development-for-personal-and-open-source-use.
>
> Maybe I'm just being dense but, umm. I don't know what any of these
> words mean :-). I'm not unfamiliar with redistributors; part of my
> confusion is that this is a concept that AFAIK distro package systems
> don't have. Maybe it would help if you have a concrete example of a
> scenario where they would benefit from having this distinction?

I'm also finding this discussion bafflingly complex. I understand that
distributions need a way to work with Python packages, but the issues
involved seem completely divorced from the basic process of a user
using pip to install a package with the dependencies it needs to work
in their program.

The package metadata standardisation process seems to be falling foul
of a quest for perfection. Is there no 80% solution that covers the
bulk of use cases (which, in my mind, are all around some user wanting
to say "pip install" to grab some stuff off PyPI to build his
project)? Or is the 80% solution precisely what we have at the moment,
in which case can't we standardise what we have, and *then* look to
extend to cover the additional requirements?

I'm sure I'm missing something - but honestly, I'm not sure what it is.

If I write something to go on PyPI, I assume that makes me a
"publisher"? IMO, my audience is people who use my software (the
"consumers" in your terms, I guess). While I'd be pleased if a
distributor like Ubuntu or Fedora or Anaconda wanted to include my
package in their distribution, I wouldn't see them as my end users -
so while I'd be OK with tweaking my code/metadata to accommodate their
needs, it's not a key goal for me. And learning all the metadata
concepts related to packaging my project for distributors wouldn't
appeal to me at all. I'd be happy for the distributions to do that and
send me PRs, but the burden should be on them to do that. The
complexities we're debating here seem to be based on the idea that *I*
should understand the distributor's role in order to package my code
"correctly". I'm not at all sure I agree with that.

Maybe this is all a consequence of Python now being used in "big
business", and the old individual developer scratching his or her own
itch model is gone. And maybe that means PyPI is no longer a suitable
place for such "use at your own risk" code. But if that's the case,
maybe we need to acknowledge that fact, before we end up with people
getting the idea that "Python packaging is too complex for the average
developer". Because it's starting to feel that way :-(

Paul


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Nick Coghlan
On 15 February 2017 at 14:00, Wes Turner  wrote:
> On Wed, Feb 15, 2017 at 5:33 AM, Nick Coghlan  wrote:
>> I asked Daniel to *stop* using pydist.json, since wheel was emitting a
>> point-in-time snapshot of PEP 426 (which includes a lot of
>> potentially-nice-to-have things that nobody has actually implemented
>> so far, like the semantic dependency declarations and the enhancements
>> to the extras syntax), rather than the final version of the spec.
>
> Would you send a link to the source for this?

It came up when Vinay reported a problem with the way bdist_wheel was
handling combined extras and environment marker definitions:
https://bitbucket.org/pypa/wheel/issues/103/problem-with-currently-generated

>> - dist-info/METADATA as defined at
>> https://packaging.python.org/specifications/#package-distribution-metadata
>> - dist-info/requires.txt runtime dependencies as defined at
>> http://setuptools.readthedocs.io/en/latest/formats.html#requires-txt
>> - dist-info/setup_requires.txt build time dependencies as defined at
>> http://setuptools.readthedocs.io/en/latest/formats.html#setup-requires-txt
>>
>> The dependency fields in METADATA itself unfortunately aren't really
>> useful for anything.
>
> Graph: Nodes and edges.

Unfortunately, it's not that simple, since:

- dependency declarations refer to time-dependent node *sets*, not to
specific edges
- node resolution is not only time-dependent, but also DNS- and
client-configuration-dependent
- this is true even for "pinned" dependencies due to the way "=="
handles post-releases and local build IDs
- the legacy module-based declarations are inconsistently populated
and don't refer to nodes by a useful name
- the new distribution-package-based declarations refer to nodes by a
useful name, but largely aren't populated

By contrast, METADATA *does* usefully define nodes in the graph, while
requires.txt and setup_requires.txt can be used to extract edges when
combined with suitable additional data sources (primarily a nominated
index server or set of index servers to use for dependency specifier
resolution).
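
A small sketch of the "node sets, not edges" point, using the 'packaging'
library (the candidate list stands in for an index server's contents):

    from packaging.requirements import Requirement

    req = Requirement("requests >= 2.0, != 2.12.0")
    print(req.name)       # the node name: 'requests'
    print(req.specifier)  # a version *set*, not a concrete node

    # Which node this resolves to depends on what the index holds at
    # resolution time, not on the metadata alone.
    candidates = ["1.9", "2.12.0", "2.13.0"]  # hypothetical index contents
    print([v for v in candidates if v in req.specifier])  # -> ['2.13.0']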

>> There's definitely still a place for a pydist.json created by going
>> through PEP 426, comparing it to what bdist_wheel already does to
>> populate metadata.json, and either changing the PEP to match the
>> existing practice, or else agreeing that we prefer what the PEP
>> recommends, that we want to move in that direction, and that there's a
>> definite commitment to implement the changes in at least setuptools
>> and bdist_wheel (plus a migration strategy that allows for reasonably
>> sensible consumption of old metadata).
>
> Which function reads metadata.json?

Likely eventually nothing, since anything important that it contains
will be readable from either pydist.json or the other legacy
metadata files.

> Which function reads pydist.json?

Eventually everything, with tools falling back to dynamically
generating it from legacy metadata formats as a transition plan to
handle component releases made with older toolchains.

> An RDFS Vocabulary contains Classes and Properties with rdfs:ranges and
> rdfs:domains.
>
> There are many representations for RDF: RDF/XML, Turtle/N3, JSONLD.
>
> RDF is implementation-neutral. JSONLD is implementation-neutral.

While true, both of these are still oriented towards working with a
*resolved* graph snapshot, rather than a deliberately underspecified
graph description that requires subsequent resolution within the node
set of a particular index server (or set of index servers).

Just incorporating the time dimension is already messy, even before
accounting for the fact that the metadata carried along with the
artifacts is designed to be independent of the particular server that
happens to be hosting it.

Tangent: if anyone is looking for an open source stack for working
with distributed graph storage manipulation from Python, the
combination of http://janusgraph.org/ and
https://pypi.org/project/gremlinpython/ is well worth a look ;)

>> The equivalent for PEP 426 would probably be legacy-to-pydist and
>> pydist-to-legacy converters that setuptools, bdist_wheel and other
>> publishing tools can use to ship legacy metadata alongside the
>> standardised format (and I believe Daniel already has at least the
>> former in order to generate metadata.json in bdist_wheel). With PEP
>> 426 as currently written, a pydist-to-legacy converter isn't really
>> feasible, since pydist proposes new concepts that can't be readily
>> represented in the old format.
>
> pydist-to-legacy would be a lossy transformation.

Given appropriate use of the "extras" system and a couple of new
METADATA fields, it doesn't have to be, at least for the initial
version - that's the new design constraint I'm proposing for
everything that isn't defined as a metadata extension.

The rationale being that if legacy dependency metadata can be reliably
generated from the new format, that creates an 

Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Daniel Holth
Wheel puts everything important in METADATA, except entry_points.txt. The
requirements expressed there under 'Requires-Dist' are reliable, and the
full METADATA format is documented in the pre-JSON revision of PEP 426. At
runtime, once pkg_resources parses it, *.egg-info and *.dist-info look
identical, because it's just a different way to represent the same data.
Wheel's version of METADATA exists as the simplest way to add the critical
'extras' feature to distutils2-era *.dist-info/METADATA, necessary to
losslessly represent setuptools packages in a more PEP-standard way. I
could have completely redesigned the METADATA format instead of extending
it, but then I would have run out of time and wheel would not exist.

This function converts egg-info metadata to METADATA
https://bitbucket.org/pypa/wheel/src/54ddbcc9cec25e1f4d111a142b8bfaa163130a61/wheel/metadata.py?at=default=file-view-default#metadata.py-240

This one converts to the JSON format. It looks like it might work with
PKG-INFO or METADATA.
https://bitbucket.org/pypa/wheel/src/54ddbcc9cec25e1f4d111a142b8bfaa163130a61/wheel/metadata.py?at=default=file-view-default#metadata.py-98
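
The gist of that conversion, as a simplified sketch (not the actual wheel
code, which handles more cases): requires.txt groups requirements into
sections, where the bare leading section is unconditional and "[extra]" or
"[extra:marker]" section names map onto Requires-Dist conditions.

    # Simplified sketch of egg-info requires.txt -> Requires-Dist lines
    def requires_to_requires_dist(requires_txt):
        section = ""  # the bare section holds unconditional requirements
        for line in requires_txt.splitlines():
            line = line.strip()
            if not line:
                continue
            if line.startswith("[") and line.endswith("]"):
                section = line[1:-1]
                continue
            extra, _, marker = section.partition(":")
            conditions = []
            if extra:
                conditions.append('extra == "%s"' % extra)
            if marker:
                conditions.append("(%s)" % marker)
            suffix = "; " + " and ".join(conditions) if conditions else ""
            yield "Requires-Dist: %s%s" % (line, suffix)

    sample = "requests\n[security]\npyOpenSSL\n[:python_version < '3']\nmock"
    print("\n".join(requires_to_requires_dist(sample)))
    # Requires-Dist: requests
    # Requires-Dist: pyOpenSSL; extra == "security"
    # Requires-Dist: mock; (python_version < '3')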


On Wed, Feb 15, 2017 at 8:27 AM Nick Coghlan  wrote:

> On 15 February 2017 at 12:58, Nathaniel Smith  wrote:
> > On Wed, Feb 15, 2017 at 3:33 AM, Nick Coghlan 
> wrote:
> >> - "requires": list where entries are either a string containing a PEP
> >> 508 dependency specifier or else a hash map containing a "requires" key
> >> plus "extra" or "environment" fields as qualifiers
> >> - "integrates": replacement for "meta_requires" that only allows
> >> pinned dependencies (i.e. hash maps with "name" & "version" fields, or
> >> direct URL references, rather than a general PEP 508 specifier as a
> >> string)
> >
> > What's accomplished by separating these? I really think we should
> > strive to have fewer more orthogonal concepts whenever possible...
>
> It's mainly a matter of incorporating
> https://caremad.io/posts/2013/07/setup-vs-requirement/ into the core
> data model, as this distinction between abstract development
> dependencies and concrete deployment dependencies is incredibly
> important for any scenario that involves
> publisher-redistributor-consumer chains, but is entirely non-obvious
> to folks that are only familiar with the publisher-consumer case that
> comes up during development-for-personal-and-open-source-use.
>
> One particular area where this is problematic is in the widespread
> advice "always pin your dependencies" which is usually presented
> without the all important "for application or service deployment"
> qualifier. As a first approximation:
> pinning-for-app-or-service-deployment == good,
> pinning-for-local-testing == good,
> pinning-for-library-or-framework-publication-to-PyPI == bad.
>
> pipenv borrows the Ruby solution to modeling this by having Pipfile
> for abstract dependency declarations and Pipfile.lock for concrete
> integration testing ones, so the idea here is to propagate that model
> to pydist.json by separating the "requires" field with abstract
> development dependencies from the "integrates" field with concrete
> deployment dependencies.
>
> In the vast majority of publication-to-PyPI cases people won't need
> the "integrates" field, since what they're publishing on PyPI will
> just be their abstract dependencies, and any warning against using
> "==" will recommend using "~=" or ">=" instead. But there *are*
> legitimate uses of pinning-for-publication (like the PyObjC
> metapackage bundling all its subcomponents, or when building for
> private deployment infrastructure), so there needs to be a way to
> represent "Yes, I'm pinning this dependency for publication, and I'm
> aware of the significance of doing so"
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Nathaniel Smith
On Wed, Feb 15, 2017 at 5:27 AM, Nick Coghlan  wrote:
> On 15 February 2017 at 12:58, Nathaniel Smith  wrote:
>> On Wed, Feb 15, 2017 at 3:33 AM, Nick Coghlan  wrote:
>>> - "requires": list where entries are either a string containing a PEP
>>> 508 dependency specifier or else a hash map containing a "requires" key
>>> plus "extra" or "environment" fields as qualifiers
>>> - "integrates": replacement for "meta_requires" that only allows
>>> pinned dependencies (i.e. hash maps with "name" & "version" fields, or
>>> direct URL references, rather than a general PEP 508 specifier as a
>>> string)
>>
>> What's accomplished by separating these? I really think we should
>> strive to have fewer more orthogonal concepts whenever possible...
>
> It's mainly a matter of incorporating
> https://caremad.io/posts/2013/07/setup-vs-requirement/ into the core
> data model, as this distinction between abstract development
> dependencies and concrete deployment dependencies is incredibly
> important for any scenario that involves
> publisher-redistributor-consumer chains, but is entirely non-obvious
> to folks that are only familiar with the publisher-consumer case that
> comes up during development-for-personal-and-open-source-use.

Maybe I'm just being dense but, umm. I don't know what any of these
words mean :-). I'm not unfamiliar with redistributors; part of my
confusion is that this is a concept that AFAIK distro package systems
don't have. Maybe it would help if you have a concrete example of a
scenario where they would benefit from having this distinction?

> One particular area where this is problematic is in the widespread
> advice "always pin your dependencies" which is usually presented
> without the all-important "for application or service deployment"
> qualifier. As a first approximation:
> pinning-for-app-or-service-deployment == good,
> pinning-for-local-testing == good,
> pinning-for-library-or-framework-publication-to-PyPI == bad.
>
> pipenv borrows the Ruby solution to modeling this by having Pipfile
> for abstract dependency declarations and Pipfile.lock for concrete
> integration testing ones, so the idea here is to propagate that model
> to pydist.json by separating the "requires" field with abstract
> development dependencies from the "integrates" field with concrete
> deployment dependencies.

What's the benefit of putting this in pydist.json? I feel like for the
usual deployment cases (a) going straight from Pipfile.lock -> venv is
pretty much sufficient, with no need to put this into a package, but
(b) if you really do want to put it into a package, then the natural
approach would be to make an empty wheel like
"my-django-app-deploy.whl" whose dependencies were the contents of
Pipfile.lock.

There's certainly a distinction to be made between the abstract
dependencies and the exact locked dependencies, but to me the natural
way to model that distinction is by re-using the distinction we
already have between source packages and binary packages. The build
process for this placeholder wheel is to "compile down" the abstract
dependencies into concrete dependencies, and the resulting wheel
encodes the result of this compilation. Again, no new concepts needed.
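
Concretely, I'd expect the placeholder wheel's setup.py to be something
like this rough sketch (untested, and assuming pipenv's current
Pipfile.lock layout; "my-django-app-deploy" is just the made-up name
from above):

    import json
    from setuptools import setup

    # Read the concrete, locked dependency set produced by "pipenv lock".
    # Assumes the "default" section maps names to {"version": "==X.Y.Z"}.
    with open("Pipfile.lock") as f:
        lock = json.load(f)

    pins = [
        name + entry["version"]  # "version" already includes the "==" operator
        for name, entry in lock["default"].items()
        if "version" in entry    # skip VCS/editable entries for simplicity
    ]

    # An otherwise-empty distribution whose only job is to carry the pins
    setup(
        name="my-django-app-deploy",
        version="1.0.0",
        install_requires=pins,
    )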

> In the vast majority of publication-to-PyPI cases people won't need
> the "integrates" field, since what they're publishing on PyPI will
> just be their abstract dependencies, and any warning against using
> "==" will recommend using "~=" or ">=" instead. But there *are*
> legitimate uses of pinning-for-publication (like the PyObjC
> metapackage bundling all its subcomponents, or when building for
> private deployment infrastructure), so there needs to be a way to
> represent "Yes, I'm pinning this dependency for publication, and I'm
> aware of the significance of doing so".

Why can't PyObjC just use regular dependencies? That's what distro
metapackages have done for decades, right?

-n

-- 
Nathaniel J. Smith -- https://vorpus.org
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Nick Coghlan
On 15 February 2017 at 12:58, Nathaniel Smith  wrote:
> On Wed, Feb 15, 2017 at 3:33 AM, Nick Coghlan  wrote:
>> - "requires": list where entries are either a string containing a PEP
>> 508 dependency specifier or else a hash map containing a "requires" key
>> plus "extra" or "environment" fields as qualifiers
>> - "integrates": replacement for "meta_requires" that only allows
>> pinned dependencies (i.e. hash maps with "name" & "version" fields, or
>> direct URL references, rather than a general PEP 508 specifier as a
>> string)
>
> What's accomplished by separating these? I really think we should
> strive to have fewer more orthogonal concepts whenever possible...

It's mainly a matter of incorporating
https://caremad.io/posts/2013/07/setup-vs-requirement/ into the core
data model, as this distinction between abstract development
dependencies and concrete deployment dependencies is incredibly
important for any scenario that involves
publisher-redistributor-consumer chains, but is entirely non-obvious
to folks that are only familiar with the publisher-consumer case that
comes up during development-for-personal-and-open-source-use.

One particular area where this is problematic is in the widespread
advice "always pin your dependencies" which is usually presented
without the all-important "for application or service deployment"
qualifier. As a first approximation:
pinning-for-app-or-service-deployment == good,
pinning-for-local-testing == good,
pinning-for-library-or-framework-publication-to-PyPI == bad.

pipenv borrows the Ruby solution to modeling this by having Pipfile
for abstract dependency declarations and Pipfile.lock for concrete
integration testing ones, so the idea here is to propagate that model
to pydist.json by separating the "requires" field with abstract
development dependencies from the "integrates" field with concrete
deployment dependencies.
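
To make that concrete, a pydist.json carrying both fields might look
something like the following. This is purely illustrative -- the exact
field names and shapes are part of what's still under discussion:

    {
      "requires": [
        "requests >= 2.12",
        {"requires": "pyOpenSSL >= 0.14", "extra": "security"},
        {"requires": "win-inet-pton", "environment": "sys_platform == 'win32'"}
      ],
      "integrates": [
        {"name": "pyobjc-core", "version": "3.2.1"},
        {"name": "internal-helper",
         "url": "https://example.com/internal-helper-1.0-py2.py3-none-any.whl"}
      ]
    }

Everything in "requires" stays abstract (ranges, extras, environment
markers), while everything in "integrates" is an exact, auditable pin.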

In the vast majority of publication-to-PyPI cases people won't need
the "integrates" field, since what they're publishing on PyPI will
just be their abstract dependencies, and any warning against using
"==" will recommend using "~=" or ">=" instead. But there *are*
legitimate uses of pinning-for-publication (like the PyObjC
metapackage bundling all its subcomponents, or when building for
private deployment infrastructure), so there needs to be a way to
represent "Yes, I'm pinning this dependency for publication, and I'm
aware of the significance of doing so".

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Wes Turner
On Wed, Feb 15, 2017 at 5:33 AM, Nick Coghlan  wrote:

> On 14 February 2017 at 21:21, Vinay Sajip via Distutils-SIG
>  wrote:
> >
> >
> >> I thought the current status was that it's called metadata.json
> >> exactly *because* it's not standardized, and you *shouldn't* look at
> >> it?
> >
> >
> > Well, it was work-in-progress-standardised according to PEP 426 (since
> > sometimes implementations have to work in parallel with working out the
> > details of specifications). Given that PEP 426 wasn't done and dusted
> > but being progressed, I would have thought it perfectly acceptable to
> > use "pydist.json", as the only things that would be affected would be
> > packaging tools working to the PEP.
>
> I asked Daniel to *stop* using pydist.json, since wheel was emitting a
> point-in-time snapshot of PEP 426 (which includes a lot of
> potentially-nice-to-have things that nobody has actually implemented
> so far, like the semantic dependency declarations and the enhancements
> to the extras syntax), rather than the final version of the spec.
>

Would you send a link to the source for this?


>
> >> It's too bad that the JSON thing didn't work out, but I think we're
> >> better off working on better specifying the one source of truth
> >> everything already uses (METADATA) instead of bringing in *new*
> >> partially-incompatible-and-poorly-specified formats.
> >
> > When you say "everything already uses", do you mean setuptools and wheel?
> > If nobody else is allowed to play, that's one thing. But otherwise, there
> > need to be standards for interoperability. The METADATA file, now -
> exactly
> > which standard does it follow? The one in the dateutil wheel that Jim
> > referred to doesn't appear to conform to any of the metadata PEPs. It was
> > rejected by old metadata code in distlib (which came out of the Python
> 3.3
> > era "packaging" package - not to be confused with Donald's of the same
> name -
> > which is strict in its interpretation of those earlier PEPs).
> >
> > The METADATA format (key-value) is not really flexible enough for certain
> > things which were in PEP 426 (e.g. dependency descriptions), and for
> these
> > JSON seems a reasonable fit.
>
> The current de facto standard set by setuptools and bdist_wheel is:
>
> - dist-info/METADATA as defined at
> https://packaging.python.org/specifications/#package-distribution-metadata
> - dist-info/requires.txt runtime dependencies as defined at
> http://setuptools.readthedocs.io/en/latest/formats.html#requires-txt
> - dist-info/setup_requires.txt build time dependencies as defined at
> http://setuptools.readthedocs.io/en/latest/formats.html#setup-requires-txt
>
> The dependency fields in METADATA itself unfortunately aren't really
> useful for anything.
>

Graph: Nodes and edges.


>
> There's definitely still a place for a pydist.json created by going
> through PEP 426, comparing it to what bdist_wheel already does to
> populate metadata.json, and either changing the PEP to match the
> existing practice, or else agreeing that we prefer what the PEP
> recommends, that we want to move in that direction, and that there's a
> definite commitment to implement the changes in at least setuptools
> and bdist_wheel (plus a migration strategy that allows for reasonably
> sensible consumption of old metadata).
>

Which function reads metadata.json?
Which function reads pydist.json?


>
> Such an update would necessarily be a fairly ruthless process, where
> we defer everything that can possibly be deferred. I already made one
> pass at that when I split out the metadata extensions into PEP 459,
> but at least one more such pass is needed before we can sign off on
> the spec as metadata 2.0 - even beyond any "open for discussion"
> questions, there are still things in there which were extracted and
> standardised separately in PEP 508.
>
> > There's no technical reason why "the JSON thing
> > didn't work out", as far as I can see - it was just given up on for a
> more
> > incremental approach (which has got no new PEPs other than 440, AFAICT).
>
> Yep, it's a logistical problem rather than a technical problem per se
> - new metadata formats need software publisher adoption to ensure the
> design is sensible before we commit to them long term, but software
> publishers are understandably reluctant to rely on new formats that
> limit their target audience to folks running the latest versions of
> the installation tools (outside constrained cases where the software
> publisher is also the main consumer of that software).
>

An RDFS vocabulary contains Classes and Properties with rdfs:range and
rdfs:domain constraints.

There are many representations for RDF: RDF/XML, Turtle/N3, JSON-LD.

RDF is implementation-neutral. JSON-LD is implementation-neutral.
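
For example, a single dependency edge as JSON-LD (vocabulary and
resource URLs invented for illustration):

    {
      "@context": {"pydist": "https://example.org/ns/pydist#"},
      "@id": "https://pypi.org/project/requests/",
      "pydist:requires": {"@id": "https://pypi.org/project/idna/"}
    }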



>
> For PEP 440 (version specifiers) and PEP 508 (dependency specifiers),
> this was handled by focusing on documenting practices that people
> already used (and checking existing PyPI projects for compatibility),
> rather than trying to actively change those practices.

[Distutils] Python installation not working

2017-02-15 Thread Chitra Dewan via Distutils-SIG
Hello,
I am a beginner in Python. I am facing problems installing Python 3.5 on my
Windows Vista x32 machine. I downloaded python-3.5.2.exe from Python.org. It is
downloaded as an exe. When I try to install it via "Run as administrator",
nothing happens. Same behavior with the 3.6 version.

Kindly advise.

Regards & Thanks, Chitra Dewan
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


[Distutils] py2exe package for 2.7

2017-02-15 Thread Venkat Ram Reddy K
Hi, good afternoon,

This is Venkat from HCL Technologies. I have created an executable file
(test.exe) using the py2exe package with Python 2.7 on Windows.

After that I ran my application from the path C:\Python27\dist\test.exe,
and it executed and worked properly.

But the problem is, when I copy test.exe to another folder (other than
"C:\Python27\dist\") and try to run it, it does not execute.

Could you please help me in resolving the issue?

Thanks,
Venkat.





___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Nathaniel Smith
On Wed, Feb 15, 2017 at 3:33 AM, Nick Coghlan  wrote:
> - "requires": list where entries are either a string containing a PEP
> 508 dependency specifier or else a hash map containing a "requires" key
> plus "extra" or "environment" fields as qualifiers
> - "integrates": replacement for "meta_requires" that only allows
> pinned dependencies (i.e. hash maps with "name" & "version" fields, or
> direct URL references, rather than a general PEP 508 specifier as a
> string)

What's accomplished by separating these? I really think we should
strive to have fewer more orthogonal concepts whenever possible...

-n

-- 
Nathaniel J. Smith -- https://vorpus.org
___
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig


Re: [Distutils] distlib and wheel metadata

2017-02-15 Thread Nick Coghlan
On 14 February 2017 at 21:21, Vinay Sajip via Distutils-SIG
 wrote:
>
>
>> I thought the current status was that it's called metadata.json
>> exactly *because* it's not standardized, and you *shouldn't* look at
>> it?
>
>
> Well, it was work-in-progress-standardised according to PEP 426 (since
> sometimes implementations have to work in parallel with working out the
> details of specifications). Given that PEP 426 wasn't done and dusted
> but being progressed, I would have thought it perfectly acceptable to
> use "pydist.json", as the only things that would be affected would be
> packaging tools working to the PEP.

I asked Daniel to *stop* using pydist.json, since wheel was emitting a
point-in-time snapshot of PEP 426 (which includes a lot of
potentially-nice-to-have things that nobody has actually implemented
so far, like the semantic dependency declarations and the enhancements
to the extras syntax), rather than the final version of the spec.

>> It's too bad that the JSON thing didn't work out, but I think we're
>> better off working on better specifying the one source of truth
>> everything already uses (METADATA) instead of bringing in *new*
>> partially-incompatible-and-poorly-specified formats.
>
> When you say "everything already uses", do you mean setuptools and wheel?
> If nobody else is allowed to play, that's one thing. But otherwise, there
> need to be standards for interoperability. The METADATA file, now - exactly
> which standard does it follow? The one in the dateutil wheel that Jim
> referred to doesn't appear to conform to any of the metadata PEPs. It was
> rejected by old metadata code in distlib (which came out of the Python 3.3
> era "packaging" package - not to be confused with Donald's of the same name -
> which is strict in its interpretation of those earlier PEPs).
>
> The METADATA format (key-value) is not really flexible enough for certain
> things which were in PEP 426 (e.g. dependency descriptions), and for these
> JSON seems a reasonable fit.

The current de facto standard set by setuptools and bdist_wheel is:

- dist-info/METADATA as defined at
https://packaging.python.org/specifications/#package-distribution-metadata
- dist-info/requires.txt runtime dependencies as defined at
http://setuptools.readthedocs.io/en/latest/formats.html#requires-txt
- dist-info/setup_requires.txt build time dependencies as defined at
http://setuptools.readthedocs.io/en/latest/formats.html#setup-requires-txt

The dependency fields in METADATA itself unfortunately aren't really
useful for anything.
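
For concreteness, the rough shapes of those files (abbreviated and
reconstructed from memory rather than copied from a real wheel):

    METADATA (RFC 822-style key/value pairs):

        Metadata-Version: 2.0
        Name: example-dist
        Version: 1.0
        Summary: An example distribution
        Requires-Dist: requests (>=2.12)

    requires.txt (one requirement per line, with bracketed sections for
    extras and environment markers):

        requests>=2.12

        [security]
        pyOpenSSL>=0.14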

There's definitely still a place for a pydist.json created by going
through PEP 426, comparing it to what bdist_wheel already does to
populate metadata.json, and either changing the PEP to match the
existing practice, or else agreeing that we prefer what the PEP
recommends, that we want to move in that direction, and that there's a
definite commitment to implement the changes in at least setuptools
and bdist_wheel (plus a migration strategy that allows for reasonably
sensible consumption of old metadata).

Such an update would necessarily be a fairly ruthless process, where
we defer everything that can possibly be deferred. I already made one
pass at that when I split out the metadata extensions into PEP 459,
but at least one more such pass is needed before we can sign off on
the spec as metadata 2.0 - even beyond any "open for discussion"
questions, there are still things in there which were extracted and
standardised separately in PEP 508.

> There's no technical reason why "the JSON thing
> didn't work out", as far as I can see - it was just given up on for a more
> incremental approach (which has got no new PEPs other than 440, AFAICT).

Yep, it's a logistical problem rather than a technical problem per se
- new metadata formats need software publisher adoption to ensure the
design is sensible before we commit to them long term, but software
publishers are understandably reluctant to rely on new formats that
limit their target audience to folks running the latest versions of
the installation tools (outside constrained cases where the software
publisher is also the main consumer of that software).

For PEP 440 (version specifiers) and PEP 508 (dependency specifiers),
this was handled by focusing on documenting practices that people
already used (and checking existing PyPI projects for compatibility),
rather than trying to actively change those practices.
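
Both of those specs are easy to poke at with Donald's "packaging"
library, which is what pip uses under the hood. A quick illustrative
session (version numbers invented):

    from packaging.requirements import Requirement
    from packaging.specifiers import SpecifierSet

    # PEP 508: name, extras, version specifier and environment marker
    req = Requirement('requests[security] >= 2.8.1, < 3 ; python_version >= "2.7"')
    print(req.name, sorted(req.extras), str(req.specifier), req.marker)

    # PEP 440: "~=" (compatible release) versus "==" (exact pin)
    print(SpecifierSet("~=2.8").contains("2.9.1"))  # True: any 2.x >= 2.8
    print(SpecifierSet("==2.8").contains("2.9.1"))  # False: exact pin only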

For pyproject.toml (e.g. enscons), the idea is to provide a setup.py
shim that can take care of bootstrapping the new approach for the
benefit of older tools that assume the use of setup.py (similar to
what was done with setup.cfg and d2to1).
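
For d2to1, if I remember its README correctly, the entire shim was just
the following -- setuptools fetches the real tool via setup_requires at
build time and then hands over control, with all the declarative
metadata living in setup.cfg:

    from setuptools import setup

    setup(
        setup_requires=['d2to1'],  # bootstrapped by setuptools at build time
        d2to1=True,                # the d2to1 hook then reads setup.cfg
    )

An enscons-style shim would follow the same pattern, with the metadata
moving to pyproject.toml instead.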

The equivalent for PEP 426 would probably be legacy-to-pydist and
pydist-to-legacy converters that setuptools, bdist_wheel and other
publishing tools can use to ship legacy metadata alongside the
standardised format (and I believe Daniel already has at least the
former in order to generate metadata.json in the first place).
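
A minimal legacy-to-pydist converter really is only a few lines, since
METADATA is just an RFC 822 header block. A rough sketch (the output
keys follow the current PEP 426 draft and may well change):

    import email.parser
    import json

    # Parse the legacy key/value metadata shipped in dist-info/METADATA
    with open("METADATA") as f:
        legacy = email.parser.Parser().parse(f)

    pydist = {
        "metadata_version": "2.0",
        "name": legacy["Name"],
        "version": legacy["Version"],
        "summary": legacy["Summary"],
        # PEP 426 groups runtime dependencies into hash maps of specifiers
        "run_requires": [{"requires": legacy.get_all("Requires-Dist") or []}],
    }

    print(json.dumps(pydist, indent=2))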