There are a few approaches I've seen:

1. Embed an API version number into the package/module name, e.g.
somepackage.api0. I've seen this once before; I don't think it's a
great way to go (it strikes me as generating a lot of maintenance
work), but it does work if you manage it carefully.
2. Use distribute (née setuptools) to manage multiple versions of
dependencies in your system install. Note that, due to the way
Python's import system works, you can't have multiple versions of the
same egg active at runtime.
3. Use virtualenv and pip to bundle up your dependencies into mini
"virtual" installs for each app. This essentially creates a
self-contained Python install per app which sneakily symlinks back to
the system one for the interpreter itself.
4. buildout is a variation of the previous point (does a build based on a spec).
5. Bundle dependencies with each application by hand. App Engine kind
of encourages this; for example python.ie ships a jinja2.zip with
jinja2 inside it. This makes checkouts easy peasy to get working.
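Option (5) works because Python can import straight from zip files on
sys.path (via zipimport). A minimal, self-contained sketch — the
module name `mylib` is made up for illustration:

```python
import os
import sys
import tempfile
import zipfile

# Build a tiny "bundled dependency" zip, the way an App Engine app
# might ship jinja2.zip alongside its own code.
bundle = os.path.join(tempfile.mkdtemp(), "mylib.zip")
with zipfile.ZipFile(bundle, "w") as zf:
    zf.writestr("mylib.py", "VERSION = '1.0'\n")

# Putting the zip itself on sys.path makes its contents importable.
sys.path.insert(0, bundle)
import mylib

print(mylib.VERSION)  # -> 1.0
```

Two apps can each carry their own mylib.zip at different versions
without ever touching the system site-packages.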

Generally speaking, option (2) is probably closest to the libfoo
linking approach. However I'd recommend going with (3) or (4), since
this is increasingly recommended by site/application maintainers as
the best way to manage dependencies for an application. virtualenv
exploits the fact that Python makes it very easy to bundle up
dependencies, which means you can manage multiple variations of
libraries without going crazy. It does mean inverting your thinking
about how to install libraries, i.e. keep the system install empty and
bundle all deps into each app.
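For comparison, option (2) exposes its version selection through
setuptools' pkg_resources runtime: require() asks for a distribution
matching a version spec, which with multi-version egg installs
activates the matching version, much like linking against libfoo.so.0
versus libfoo.so.1. A rough sketch, querying setuptools itself since
it's always installed:

```python
import pkg_resources

# require() resolves a requirement string against the installed
# distributions and returns the activated distribution(s).
dist = pkg_resources.require("setuptools")[0]
print(dist.project_name, dist.version)

# With multi-version installs you could instead pin an API, e.g.
# pkg_resources.require("somepackage>=1.0,<2.0") -- "somepackage"
# here is hypothetical.
```

This is the mechanism, but note the sharp edges mentioned below.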

Coincidentally 
http://opensource.washingtontimes.com/blog/post/coordt/2010/01/how-we-create-and-deploy-sites-fast-virtualenv-and/
gives a good insight into one way to manage virtualenv stuff.

(I regard many of the path tricks setuptools/distribute performs as
hacks, and as such they have sharp edges, so you can run into very
strange issues which can trip up the unwary. A deep knowledge of
Python's import mechanism is needed to figure these issues out.)

mick

2010/1/8 Maciej Bliziński <[email protected]>:
> Suppose I distribute a Python module.  It gets installed on a couple
> systems.  After a while, I need to introduce a backward-incompatible
> API change.  If I shipped the updated library, all the programs using
> the library would break.
>
> C libraries have a mechanism to deal with this problem.  Every library
> has a SONAME, for example "libfoo.so.0", which denotes the version of
> the API.  If the API of the library changes, a shared library with a
> new soname is created, for example "libfoo.so.1".  Both libfoo.so.0
> and libfoo.so.1 can be installed on a single system.  Executables or
> other libraries using libfoo can find libfoo.so.0 or libfoo.so.1
> depending on which one they've been originally linked against.
>
> Is there a similar mechanism for Python?  I can think of a couple ways
> of implementing/emulating it, but I don't want to force an open door.
> What would you suggest?
>
> Maciej
>
> --
> You received this message because you are subscribed to the Google Groups 
> "Python Ireland" group.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to 
> [email protected].
> For more options, visit this group at 
> http://groups.google.com/group/pythonireland?hl=en.
>
>
>
>