On 01/20/2015 08:15 PM, Robert Collins wrote:
> On 21 January 2015 at 10:21, Clark Boylan <cboy...@sapwetik.org> wrote:
> ...
>> This ML thread came up in the TC meeting today, and I am responding
>> here to catch the thread up with the meeting. The soft update option
>> is the suggested fix for non-OpenStack projects that want to have
>> most of their requirements managed by global requirements.
>>
>> As the project structure reform opens things up, we should consider
>> loosening the criteria for getting on the list and making it
>> primarily based on technical criteria such as py3k support, license
>> compatibility, upstream support/activity, and so on (basically the
>> current criteria, with less focus on where a project comes from if it
>> is otherwise healthy). Then individual projects would choose the
>> subset they need to depend on. This model should be viable with
>> different domains as well if we go that route.
>>
>> The following is not from the TC meeting but addressing other portions
>> of this conversation:
>>
>> At least one concern with this option is that as the total number of
>> requirements goes up, debugging installation conflicts becomes more
>> difficult too. I have suggested that we could write tools to help
>> with this (install bisection based on pip logs, for example), but
>> these tools are still theoretical, so I may be overestimating their
>> usefulness.
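
The bisection tool Clark mentions is still theoretical, but the idea can
be sketched. Assuming a hypothetical installs_ok(subset) predicate that
attempts a clean install of a prefix of the requirements list (e.g. via
pip in a throwaway venv) and reports success, it might look roughly
like:

```python
def bisect_requirements(reqs, installs_ok):
    """Find the first requirement whose addition breaks installation.

    Assumes the empty prefix installs cleanly, the full list does not,
    and failures are monotonic: once a prefix is broken, longer
    prefixes stay broken.  `installs_ok` is a hypothetical predicate,
    not an existing tool.
    """
    lo, hi = 0, len(reqs)  # invariant: reqs[:lo] installs, reqs[:hi] does not
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if installs_ok(reqs[:mid]):
            lo = mid
        else:
            hi = mid
    return reqs[hi - 1]  # the culprit requirement
```

Each probe is one (expensive) trial install, so finding the culprit
among N requirements costs about log2(N) installs instead of N.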
>>
>> To address the community scaling aspect: I think we push a lot of
>> work back on deployers/users if we don't curate requirements for
>> anything that ends up tagged as "production ready" (or whatever the
>> equivalent tag becomes). Essentially we are saying "this doesn't
>> scale for us, so now you deal with the fallout. Have fun", which
>> isn't very friendly to people consuming the software. We already have
>> an absurd number of requirements, and managing them has scaled so
>> far. I don't foresee my workload going up if we open up the list as
>> suggested.
> 
> Perhaps I missed something, but the initial request wasn't about
> random packages; it was about other stackforge clients, which are
> things in the ecosystem! I'm glad we have technical solutions, but it
> just seems odd to me that adding them would ever have been
> controversial.

Well, I think Clark and I have different opinions of how much of a pain
unwinding the requirements is, and how long these conflicts tend to
leave the gate broken. I am happy to also put resolving the issues in a
"somebody else's problem field". :)

Honestly, I think we're actually at a different point, where we need to
stop assuming that the sane way to deal with Python is to install it
into the system libraries, and instead put every service in a venv and
get rid of global requirements entirely. Global requirements was a
scaling fix for getting to 10 coexisting projects; I don't think it
works well with 50 ecosystem projects, which is why I proposed the
domains solution instead.
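
As a sketch of what venv-per-service could look like (the base path and
service names here are made up for illustration, not an agreed layout):

```python
import subprocess
import venv
from pathlib import Path

def install_service(name, requirements, base="/opt/openstack"):
    """Give each service its own virtualenv, so its pins never have to
    be co-installable with any other service's.  Paths are illustrative.
    """
    env = Path(base) / name / "venv"
    venv.create(env, with_pip=True)
    pip = env / "bin" / "pip"
    if requirements:  # nothing to do for an empty list
        subprocess.check_call([str(pip), "install", *requirements])
    return env
```

Each service then runs from its own interpreter (something like
/opt/openstack/nova/venv/bin/... in this hypothetical layout), and a
version conflict between two services' pins simply cannot happen.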

> On the pip resolver side, Joe Gordon was working on a thing to install
> a fixed set of packages by bypassing the pip resolver... not sure how
> that's progressing.

I think if we are seriously talking about bypassing the pip resolver, we
should step back and think about that fact. It means we are producing a
custom installation process that yields an answer for us that is
completely different from the answer anyone else gets when trying to
assemble a coherent system.
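
For reference, the usual shape of such a bypass (a generic sketch, not
Joe's actual tool) is to install an exact, pre-computed pin list with
pip's --no-deps flag, so that no dependency resolution happens at
install time:

```python
import subprocess
import sys

def install_pinned(requirements_file):
    """Install an exact, pre-computed set of pins with no dependency
    resolution at all.  Generic sketch of a resolver bypass; the
    coherence of the result depends entirely on whoever computed the
    pins, since pip checks nothing with --no-deps."""
    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "--no-deps", "-r", requirements_file,
    ])
```

Which is exactly the concern above: the result is asserted by the pin
list rather than checked by pip, and nobody installing the same packages
any other way gets the same answer.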

        -Sean

-- 
Sean Dague
http://dague.net

__________________________________________________________________________
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev
