On Sun, 06 Aug 2000, Jeffry Smith <[EMAIL PROTECTED]> wrote:
> On more important stuff - Linus' statement on LKML about binary
> compatibility is he doesn't care. He ensures compatibility within a
> release (2.2, 2.0, etc), but between major releases, he doesn't. I've
> seen similar statements from Alan Cox.
Please don't confine binary compatibility to the kernel (& Linus) domain:
Hardware <--> drivers <--> Kernel <--> libraries <--> applications
          A            B           C              D
There are about 150-200 system calls at "C".
There are about 10000-40000 ABI calls at "D". (use nm -D and count for
yourself on RH 6.2)
In my post I was complaining about breakage at "D". (There is further
breakage among distros at the filesystem/config-file level, which is a
different form of incompatibility that also hurts!)
I have heard Linus talk a couple of times, and he says he basically
doesn't give a rip about "D" ;-) He also has no control over
incompatibility there: it is the library developers (glibc, gnome,
etc.) who control application breakage at that level.
> Rationale:
> 1. This is Open Source world. If you have the source, you can
> recompile. If you don't have the source, it's your problem getting
> the source. Also, don't come to him with problems on systems with
> binary modules. Insist on Open Source, or don't complain to them when
> it breaks.
I don't buy this argument completely (though I love open source & etc).
Suppose I have a bunch of machines with some version of RH on it and I
decide one day I want my users to run "super-gnomo-app-v0.1". But it
requires library:
/lib/libc-2.1.5.pre.maybe.next.tuesday.so
to run. Fine, but if I upgrade to that libc (and ld-linux.so!!!) it
breaks 20% of the existing apps (e.g. 3rd party, in-house, etc).
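Before an upgrade like that, you can at least see what an existing binary expects from libc. A sketch, using /bin/ls as a stand-in for any third-party app:

```shell
# List the shared libraries a binary is linked against at run time.
ldd /bin/ls
# Show which versioned glibc symbols it actually references; if a new
# libc drops or changes any of these versions, the binary breaks.
objdump -T /bin/ls | grep GLIBC | head
```

This only tells you what might break, of course; it doesn't make the old and new libraries coexist.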
Aaaah! I can't do both and keep my users' existing stuff working!!
Furthermore, it may not just be a matter of recompiling, but of actually
hacking the source code to work with the old or the new libraries, etc.
End-users like me shouldn't have to do this...
I believe this basically cripples small (and maybe even large) ISV's on
Linux. They can't afford to support the breakage when their customers
upgrade some other part of the system.
I for one do not have a problem with ISV's charging $$ for their
software and/or customizations... perhaps others do. Open source is
great, but I don't think it should be applied at the 100% level for all
situations. In fact, I imagine a pretty small minority of the gnhlug
list is 100% OSS systems (e.g. netscape 4.x is not OSS).
> 2. Binary compatibility means just that - if a bad feature is there,
> you have to keep it. Kruft accumulates, and changes become hard to do
> in a way that doesn't break compatibility. He's about fixing stuff -
> if there's a better way to do it, you use that, and recompile your
> source (you've got it from 1, right?).
Again, don't limit this to our hero Linus and interface "C". It is the
library developers and distros that are going through the "Unix growing
pains" and breaking us. Some of it is breakage for "now doing it the
right way", but some of it is just gratuitous breakage (e.g. ipchains).
Linux is still my "OS of choice", but one does pay a tax as an
end-user, and as more of us depend critically on Linux, I feel the
problem will only get bigger.
Question: should a developer (e.g. a library or kernel developer)
spend 4-5 hours of his time keeping binary compatibility, or should
300,000 Linux users each spend 15 minutes of their time (hopefully)
getting everything to work again? I understand that it depends. I just
hope people don't think it should always be the latter...
Karl