I am vaguely familiar with the regression testing that goes on
with Perl CPAN modules.  The Perl community is very test-oriented,
which is one reason they are a pleasure to hang out with.  With
most software, it seems that the tools are not available to
automate detailed all-versions, all-interfaces testing of
candidate code.  Most developers release the beta, wait
for complaints, and, if no complaints are career-threatening
or accompanied by patches, ship.

Part of the testing gap involves old libraries.  At some point,
programmers say "I must use features that are only in libfoo
version 7 or higher", abandoning everyone whose systems use
libfoo versions 1 through 6.  Well, at least they are NOTICING
that they are depending on version 7 features.  Some coders
don't even bother with that.
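
A partial defense exists even now: rather than hard-requiring
version 7, a program can probe at runtime for the one symbol it
actually needs and fall back gracefully.  A minimal sketch in C,
where "libfoo" and "foo_fancy_feature" are invented names:

    /* probe.c - check at runtime whether the installed libfoo
     * offers the version-7 feature, instead of refusing to run
     * on versions 1 through 6.  Library and symbol names are
     * invented.  Build: cc probe.c -ldl
     */
    #include <stdio.h>
    #include <dlfcn.h>

    int main(void)
    {
        void *h = dlopen("libfoo.so", RTLD_NOW);  /* whatever is installed */
        if (!h) {
            fprintf(stderr, "no libfoo found: %s\n", dlerror());
            return 1;
        }

        /* Probe for a symbol that first appeared in version 7. */
        void (*fancy)(void) = (void (*)(void)) dlsym(h, "foo_fancy_feature");
        if (fancy)
            puts("version 7 or later - using the fancy code path");
        else
            puts("older libfoo - falling back to the plain code path");

        dlclose(h);
        return 0;
    }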

In such an environment, why the heck do we design the loader
to require the same version of libfoo for every application
on the system?  Why not keep a disk copy of every
version of libfoo ever needed by any code ever run on the
system?  /usr/lib takes up about 2GB on my system - with
hard drives costing $80 for 2TB, and quintuple redundancy
(including backups), that 2GB costs me 40 cents.  With a
more granular way to tie applications to particular versions
of a library, the system could keep every version of libfoo
that is needed by any piece of code on the system, no matter
how ancient it is.  And that might cost only a couple of
bucks for 10GB of extra library.  
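
Some of this granularity is already possible by hand: an app can
ship the exact library build it was tested against in a private
directory and load it by full path (or bake that directory in at
link time with -Wl,-rpath).  A minimal sketch, with the path and
version number invented for illustration:

    /* pin.c - tie one app to one exact libfoo build, kept in a
     * per-app directory that no distro upgrade ever touches.
     * The path and version number are hypothetical.
     * Build: cc pin.c -ldl
     */
    #include <stdio.h>
    #include <dlfcn.h>

    int main(void)
    {
        /* The 2003-vintage library, preserved alongside the app. */
        void *h = dlopen("/opt/oldapp/lib/libfoo.so.2.4.20", RTLD_NOW);
        if (!h) {
            fprintf(stderr, "dlopen: %s\n", dlerror());
            return 1;
        }
        puts("running against the pinned libfoo");
        dlclose(h);
        return 0;
    }

The catch is that every application must arrange this for itself;
the point here is that the loader could do it for everyone.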

Yes, we still must test interdependencies between different
and independent packages, how they trade evolving data
structures, and how the different libraries interact with
the kernel and the compilers.  But most of these things
are tied relatively loosely - the interactions between
OpenOffice.org and Firefox are far fewer than the
interactions internal to each package, to associated
libraries, and to legacy content.  And these interactions
must be tested for security reasons anyway.

New distros clean up the mess, making sure all the apps
on the DVD use the latest libraries.  They do not need to
include all the legacy ones.  But if you have a good
reason to subsequently add an old app depending on old
libraries, the system should support that.

So.  At this late stage of the architecture of Linux, is it
possible to revamp the dynamic loader (ld.so) and the naming
system for libraries?
Imagine that every application can choose the library
versions it wants to run, and every application installer
can dynamically assign paths to those libraries.  Perhaps
the loader can keep updatable "loader maps" for each
application somewhere, which help it decide which versions
of a library are usable.  With a non-permissive map, the
loader can use a 2003 version of a library forever, until
the app is updated.  Individual apps can be updated to
the latest libraries without triggering dependency hell. 
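
To make the "loader map" idea concrete, here is a sketch of what
a userspace shim might do.  The map file format, its location
under a hypothetical /etc/loadmap.d/, and the library names are
all invented; ld.so has no such mechanism today:

    /* loadmap.c - a sketch of the "loader map" idea: a per-app
     * text file pinning each library to one exact version.  The
     * file format, its path, and the library names are invented;
     * ld.so has no such mechanism today.  A map file might read:
     *
     *     libfoo  /usr/lib/versions/libfoo.so.3.1.4
     *     libbar  /usr/lib/versions/libbar.so.7.0.2
     *
     * Build: cc loadmap.c -ldl
     */
    #include <stdio.h>
    #include <string.h>
    #include <dlfcn.h>

    /* Return the pinned path for 'name', or NULL if unmapped. */
    static const char *map_lookup(const char *mapfile, const char *name)
    {
        static char path[1024];
        char lib[256];
        FILE *f = fopen(mapfile, "r");

        if (!f)
            return NULL;
        while (fscanf(f, "%255s %1023s", lib, path) == 2) {
            if (strcmp(lib, name) == 0) {
                fclose(f);
                return path;   /* pinned: use this exact file */
            }
        }
        fclose(f);
        return NULL;
    }

    int main(void)
    {
        const char *pin = map_lookup("/etc/loadmap.d/myapp", "libfoo");

        /* A permissive map simply omits the entry, and the app
         * gets whatever the system default resolves to. */
        void *h = dlopen(pin ? pin : "libfoo.so", RTLD_NOW);
        if (!h) {
            fprintf(stderr, "dlopen: %s\n", dlerror());
            return 1;
        }
        printf("loaded %s\n", pin ? pin : "system default libfoo");
        dlclose(h);
        return 0;
    }

Presumably a real implementation would live in ld.so itself,
consulting the map ahead of its normal search path.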

The /usr/lib disk footprint grows with time, but not nearly
as fast as the disks are growing.  The RAM footprint only
grows with the libraries actually in use at a given instant. 
So there may be 10 versions of a library, but typically
only one or two will be running at once.  A really
aggressive loader might compensate for the extra RAM usage
by loading only the portions of extra libraries that
differ, and stitching them together.  But the amount of 
RAM occupied by multiple versions of libraries will be
small, compared to the amount used by data structures,
or not returned to the system by sloppy garbage collection. 
Perhaps we can upgrade to cleaner versions of apps more
frequently if we are not bound by dependencies.
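
The stitching idea can be sized up today: comparing two versions
of a library page by page shows how much of them is literally
identical (actually sharing those pages across different files
would take something like the kernel's same-page merging).  A
rough sketch; the file paths on the command line are placeholders:

    /* pageoverlap.c - count how many 4 KiB pages two library
     * versions have in common at the same offsets.
     * Usage: ./pageoverlap libfoo.so.6.0 libfoo.so.7.2
     */
    #include <stdio.h>
    #include <string.h>

    #define PAGE 4096

    int main(int argc, char **argv)
    {
        unsigned char pa[PAGE], pb[PAGE];
        long same = 0, total = 0;

        if (argc != 3) {
            fprintf(stderr, "usage: %s old_lib new_lib\n", argv[0]);
            return 1;
        }
        FILE *a = fopen(argv[1], "rb");
        FILE *b = fopen(argv[2], "rb");
        if (!a || !b) {
            perror("fopen");
            return 1;
        }

        /* Walk both files one page at a time. */
        for (;;) {
            size_t na = fread(pa, 1, PAGE, a);
            size_t nb = fread(pb, 1, PAGE, b);
            if (na == 0 && nb == 0)
                break;
            total++;
            if (na == nb && memcmp(pa, pb, na) == 0)
                same++;
        }
        printf("%ld of %ld pages identical (%.1f%%)\n",
               same, total, total ? 100.0 * same / total : 0.0);
        return 0;
    }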

Why not?

Keith

-- 
Keith Lofstrom          [email protected]         Voice (503)-520-1993
KLIC --- Keith Lofstrom Integrated Circuits --- "Your Ideas in Silicon"
Design Contracting in Bipolar and CMOS - Analog, Digital, and Scan ICs