Hi!
I write, use and reuse a lot of small Python programs for various purposes in
my work. These use a growing number of utility modules that I'm continuously
developing and adding to as new functionality is needed. Sometimes I discover
earlier design mistakes in these modules, and rather than keeping old garbage I
often rewrite the parts that are unsatisfactory. This often breaks backwards
compatibility, and since I don't feel like updating all the code that relies on
the old (functional but flawed) modules, I'm left with a library of hacks that
depends on outdated versions of my utility modules. For now, I just update
each program whenever I next need it, but this approach makes me feel a bit
queasy. It seems to me like I'm thinking about this the wrong way.
Does anyone else recognize this situation in general? How do you handle it?
I have a feeling it should be possible to have multiple versions of the modules
installed simultaneously, and maybe do something like this:
mymodule/
    + mymodule-1.1.3/
    + mymodule-1.1.0/
    + mymodule-0.9.5/
    - __init__.py
with some kind of magic in __init__.py that lets the programmer choose a
version after import:
import mymodule
mymodule.require_version("1.1.3")
Is this a good way of thinking about it? What would be an efficient way of
implementing it?
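
To make the idea concrete, here is a rough, untested sketch of what I imagine
the __init__.py magic could look like. The impl.py filename and the
one-file-per-version layout are just assumptions for the sake of illustration:

# mymodule/__init__.py -- a rough sketch, not battle-tested.
# Assumed layout (illustrative only): each version directory holds
# that version's code in a single file, e.g.
#     mymodule/mymodule-1.1.3/impl.py
import importlib.util
import os

_here = os.path.dirname(__file__)

def require_version(version):
    """Load the requested version and expose its names on this package."""
    path = os.path.join(_here, "mymodule-%s" % version, "impl.py")
    if not os.path.exists(path):
        raise ImportError("mymodule version %s is not installed" % version)
    spec = importlib.util.spec_from_file_location(
        "mymodule._v" + version.replace(".", "_"), path)
    impl = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(impl)
    # Copy the chosen version's public names onto this package, so
    # code that already did "import mymodule" sees that version.
    for name in dir(impl):
        if not name.startswith("_"):
            globals()[name] = getattr(impl, name)

With that in place, the usage would be exactly as above. One wart I can
already see: two programs running in the same process couldn't require
different versions, since the second require_version() call would clobber
the first.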
Cheers!
/Joel Hedlund