Maybe we should get on the namespaces bandwagon and allow organizations to register a prefix. Then you'd know that a dependency called "company/mysupersecretprogram" could never accidentally exist on PyPI.
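To make the idea concrete, here is a minimal sketch of how such a prefix reservation check might work on an index. All names here (the prefix table, the owner strings) are made up for illustration; this is not how PyPI works today:

```python
# Hypothetical prefix registry: an organization reserves "company/", and
# the index refuses to let anyone else publish a name under that prefix.

RESERVED_PREFIXES = {
    "company/": "company-org",  # prefix -> owning organization (made up)
}

def may_publish(package_name, publisher):
    """Allow publishing unless the name falls under a prefix reserved
    by a different organization."""
    for prefix, owner in RESERVED_PREFIXES.items():
        if package_name.startswith(prefix):
            return publisher == owner
    return True  # unreserved namespace: anyone may publish

print(may_publish("company/mysupersecretprogram", "someone-else"))  # False
print(may_publish("company/mysupersecretprogram", "company-org"))   # True
```

With a reservation like this in place, an internal name under the company's prefix simply cannot be squatted on the public index.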
On Fri, Jul 25, 2014 at 9:21 AM, Nick Coghlan <[email protected]> wrote:
> On 25 July 2014 23:13, Richard Jones <[email protected]> wrote:
>>> Yes, those are two solutions. Another solution is for PyPI to allow
>>> registering a namespace, like dstufft.*, and companies simply name all their
>>> packages that. This isn't a problem unique to this PEP, though. This problem
>>> exists any time a company has an internal package that they do not want on
>>> PyPI. It's unlikely that any of those companies are using the external link
>>> feature if that package is internal.
>>
>> As I mentioned, using devpi solves this issue for companies hosting internal
>> indexes. Requiring companies to register names on a public index to avoid
>> collision has been raised a few times along the lines of "I hope we don't
>> have to register names on the public index to avoid this." :)
>
> Restricting packages to come from particular indexes is (or should be)
> independent of the PEP 470 design. pip has multiple index support
> today, and if you enable it, any enabled index can currently provide
> any package.
>
> If that's a significant concern for anyone, changing it is just a pip
> RFE rather than needing to be part of a PEP.
>
>>> > There still remains the usability issue of unsophisticated users running
>>> > into external indexes and needing to cope with that in one of a myriad of
>>> > ways, as evidenced by the PEP. One solution proposed and refined at the
>>> > EuroPython gathering today has PyPI caching packages from external indexes
>>> > *for packages registered with PyPI*. That is: a requirement of registering
>>> > your package (and external index URL) with PyPI is that you grant PyPI
>>> > permission to cache packages from your index in the central index - a
>>> > scenario that is ideal for users. Organisations not wishing to do that
>>> > understand that they're the ones causing the pain for users.
>>>
>>> We can't cache the packages which aren't currently hosted on PyPI. Not in
>>> an automatic fashion, anyway. We'd need to ensure that their license allows
>>> us to do so. The PyPI ToS ensures this when they upload, but if they never
>>> upload then they've never agreed to the ToS for that artifact.
>>
>> I didn't state it clearly: this would be opt-in, with the project granting
>> PyPI permission to perform this caching. Their option is to not do so and
>> simply not have a listing on PyPI.
>
> This is exactly the "packages not hosted on PyPI are second class
> citizens" scenario we're trying to *avoid*. We can't ask a global
> community to comply with US export laws just to be listed on the main
> community index.
>
>>> > An extension of this proposal is quite elegant; to reduce the pain of
>>> > migration from the current approach to the new, we implement that caching
>>> > right now, using the current simple index scraping. This ensures the
>>> > packages are available to all clients throughout the transition period.
>>>
>>> As said above, we can't legally do this automatically; we'd need to ensure
>>> that there is a license that grants us distribution rights.
>>
>> A variation on the above two ideas is to just record the *link* to the
>> externally-hosted file from PyPI, rather than that file's content. It is
>> more error-prone, but avoids issues of file ownership.
>
> This is essentially what PEP 470 proposes, except that the link says
> "this project is hosted on this external index, check there for the
> relevant details" rather than having individual links for every
> externally hosted version.
>
> Cheers,
> Nick.

and there would be great rejoicing. IIUC conda's binstar does something like this...

_______________________________________________
Distutils-SIG maillist - [email protected]
https://mail.python.org/mailman/listinfo/distutils-sig
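Nick's point that "any enabled index can currently provide any package" is the crux of the collision risk. A minimal sketch (not pip's actual resolver, and with made-up index URLs and package names) of why that is:

```python
# Hypothetical sketch: with multiple indexes enabled, a candidate from
# ANY index can satisfy a requirement -- there is no per-package index
# pinning, so the "best" (highest) version wins wherever it lives.

def find_candidates(package, indexes):
    """Collect (version, index_url) candidates from every enabled index."""
    candidates = []
    for url, listing in indexes.items():
        for version in listing.get(package, []):
            candidates.append((version, url))
    return candidates

def best_candidate(package, indexes):
    """Pick the highest version, regardless of which index hosts it."""
    candidates = find_candidates(package, indexes)
    return max(candidates) if candidates else None

# An internal name that also (accidentally or maliciously) exists publicly:
indexes = {
    "https://pypi.example/simple": {"company-internal-tool": [(9, 9, 9)]},
    "https://internal.example/simple": {"company-internal-tool": [(1, 0, 0)]},
}

# The public index "wins" because its version number is higher.
print(best_candidate("company-internal-tool", indexes))
```

This is exactly the failure mode that registering a namespace (or restricting a package to a particular index, the pip RFE Nick mentions) would close off.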
