On 16 April 2015 at 04:52, David Cournapeau <courn...@gmail.com> wrote:
>
> On Wed, Apr 15, 2015 at 11:32 AM, Trishank Karthik Kuppusamy
> <trish...@nyu.edu> wrote:
>>
>> On 15 April 2015 at 11:15, David Cournapeau <courn...@gmail.com> wrote:
>>>
>>> This is indeed the case. If you want to solve dependencies in a way that
>>> works well, you want an index that describes all your available package
>>> versions.
>>>
>>> While solving dependencies is indeed NP-complete, it can be fairly fast
>>> in practice because of various specificities: each rule generally involves
>>> only a few variables, and the rules have a specific form that allows a
>>> more efficient representation (e.g. "at most one of these variables",
>>> etc.). In my experience, it is not more difficult than using graph-based
>>> algorithms.
>>>
>>> FWIW, at Enthought we are working on a pure-Python SAT solver for
>>> resolving dependencies, to solve some of those issues. I am actually
>>> hacking on it right now at PyCon; we hope to have a first working version
>>> by the end of Q2, at which point it will be open-sourced and reintegrated
>>> into my older project depsolver
>>> (https://github.com/enthought/depsolver).
>>>
>>
>> Awesome! Then pip could use that in the near future :)
>
> That's the goal. For various reasons, it ended up easier to develop the
> solver within our own package manager, enstaller, but once that is done,
> extracting it as a separate library should not be too hard. It is, for
> example, designed to support various versioning schemes (for legacy
> reasons we can't use PEP 440 just yet).
>
> Regarding speed, initial experiments showed that even for relatively deep
> graphs, the running time is spent outside the SAT solver (e.g. to generate
> the rules you need to parse the version of every package you want to
> consider, and parsing thousands of PEP 440 versions is slow :) ).
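To make the SAT encoding concrete, here is a minimal sketch (not Enthought's solver) of the rule forms David mentions: one boolean variable per (package, version) pair, pairwise "at most one version" clauses per package, and an implication clause per dependency. A brute-force model search stands in for a real SAT solver, and the package index below is entirely hypothetical.

```python
from itertools import combinations, product

# Hypothetical index: {pkg: {version: [(dep, allowed_versions)]}}
index = {
    "app": {"1.0": [("lib", ["2.0", "2.1"])]},
    "lib": {"1.0": [], "2.0": [], "2.1": []},
}

# One integer SAT variable per (package, version) pair.
var = {}
for pkg, versions in index.items():
    for v in versions:
        var[(pkg, v)] = len(var) + 1

clauses = []
# At least one version of the requested package must be installed.
clauses.append([var[("app", v)] for v in index["app"]])
# At most one version of each package (pairwise encoding).
for pkg, versions in index.items():
    for a, b in combinations(versions, 2):
        clauses.append([-var[(pkg, a)], -var[(pkg, b)]])
# Choosing (pkg, v) implies choosing one of the allowed dep versions.
for pkg, versions in index.items():
    for v, deps in versions.items():
        for dep, allowed in deps:
            clauses.append([-var[(pkg, v)]] + [var[(dep, a)] for a in allowed])

def solve(clauses, nvars):
    # Brute force over all assignments -- fine for a toy problem only;
    # a real solver would use DPLL/CDCL here.
    for bits in product([False, True], repeat=nvars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        if all(any(assign[abs(lit)] == (lit > 0) for lit in c) for c in clauses):
            return assign
    return None

model = solve(clauses, len(var))
chosen = sorted(k for k, n in var.items() if model[n])
print(chosen)  # [('app', '1.0'), ('lib', '2.1')]
```

The "at most one" clauses are exactly the special-form rules that make these instances easy in practice: they are all short, and solvers can represent them more compactly than the pairwise encoding above.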
My intent was to use a simple backtracking enhancement to the existing pip
code, because:

- there is no index of dependencies today
- many packages have no wheels, so we have to run arbitrary code in a new
  process to produce dependency metadata
- so considering only the versions we have to process seems preferable.

Are you working on integrating your solver into pip? My current work queue
is roughly:

- declarative setup_requires and install_requires/extras_require support in
  pip and setuptools
- handling conflicts with already-installed packages
  (https://github.com/pypa/pip/issues/2687)
- then teaching pip how to backtrack

If you're actively working on integrating it into pip, cool.

-Rob

--
Robert Collins <rbtcoll...@hp.com>
Distinguished Technologist
HP Converged Cloud
_______________________________________________
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig
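For contrast with the SAT approach, here is a toy sketch of the backtracking idea described above: consider candidate versions newest-first, recurse into their dependencies, and back up to an older version when a choice leads to a conflict. The index and version strings are hypothetical; in real pip the metadata would be produced lazily (by downloading wheels or running setup.py), which is exactly why only considering the versions you must process is attractive.

```python
# Hypothetical index: {pkg: {version: {dep: allowed_versions}}},
# versions listed newest-first.
index = {
    "app": {"2.0": {"lib": ["2.0"]}, "1.0": {"lib": ["1.0", "2.0"]}},
    "lib": {"2.0": {"ssl": ["1.1"]}, "1.0": {"ssl": ["1.0"]}},
    "ssl": {"1.0": {}},  # ssl 1.1 does not exist, so lib 2.0 is a dead end
}

def resolve(requirements, pinned=None):
    """Return {pkg: version} satisfying all requirements, or None."""
    pinned = dict(pinned or {})  # copy so callers can backtrack cleanly
    if not requirements:
        return pinned
    (pkg, allowed), rest = requirements[0], requirements[1:]
    if pkg in pinned:
        # Already chosen: succeed only if the pin satisfies this requirement.
        return resolve(rest, pinned) if pinned[pkg] in allowed else None
    for version in index[pkg]:          # newest first
        if version not in allowed:
            continue
        pinned[pkg] = version
        deps = list(index[pkg][version].items())
        result = resolve(deps + rest, pinned)
        if result is not None:
            return result               # this choice worked
        del pinned[pkg]                 # backtrack: try an older version
    return None

solution = resolve([("app", ["2.0", "1.0"])])
print(solution)  # {'app': '1.0', 'lib': '1.0', 'ssl': '1.0'}
```

Note how app 2.0 is tried first, fails because lib 2.0 needs a nonexistent ssl 1.1, and the resolver falls back to app 1.0 — without ever touching versions it didn't need to consider.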