On 16 December 2015 at 16:40, Glyph Lefkowitz <[email protected]> wrote:
> On Dec 15, 2015, at 8:56 PM, Chris Jerdonek <[email protected]>
> wrote:
> Thanks for any help or suggestions.
>
>
> This is what I'm doing right now (occasionally manually curating the output
> of `pip freeze`) but I have heard good things about
> https://github.com/nvie/pip-tools/ and I intend to investigate it. As I
> understand it, pip-compile is the tool you want.
I just ran across pip-tools recently myself, and while I haven't actually
tried it out yet, I think the design makes a lot of sense for the VCS-based
deployment case:

* you write a requirements.in file with your direct dependencies (including
  any pinning required for API compatibility)
* pip-compile turns that into a requirements.txt that pins all your
  dependencies to their latest versions
* pip-sync makes a virtualenv *exactly* match a requirements.txt file
  (installing, uninstalling, upgrading and downgrading as needed)

That makes "upgrade dependencies" (by rerunning pip-compile) a clearly
distinct operation from "deploy current dependencies" (by using pip-sync in
an existing environment, or by installing into a fresh environment based on
the generated requirements.txt).

I'm not sure it makes as much sense in the case where the thing you're
working on is itself a distributable Python package with its own setup.py -
it seems like you'd end up duplicating information between setup.py and
requirements.in.

Cheers,
Nick.

--
Nick Coghlan   |   [email protected]   |   Brisbane, Australia
_______________________________________________
Distutils-SIG maillist  -  [email protected]
https://mail.python.org/mailman/listinfo/distutils-sig
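
For readers following along, a minimal sketch of the workflow described in
the message above, assuming pip-tools is installed in the active virtualenv;
the package names and version pins below are placeholders, not anything from
the thread:

    # requirements.in -- direct dependencies only, plus any pins needed
    # for API compatibility
    requests>=2.7,<3.0
    celery

    $ pip install pip-tools          # provides pip-compile and pip-sync
    $ pip-compile requirements.in    # writes requirements.txt with every
                                     # dependency (including transitive ones)
                                     # pinned to an exact version
    $ pip-sync requirements.txt      # makes the current virtualenv match the
                                     # pinned set exactly, installing, removing,
                                     # upgrading or downgrading as needed

    # "upgrade dependencies"       = rerun pip-compile and commit the regenerated
    #                                requirements.txt
    # "deploy current dependencies" = pip-sync against an existing environment,
    #                                or pip install -r requirements.txt into a
    #                                fresh one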
