On 12/30/2012 10:49 PM, Paul D. Fernhout wrote:
Some people here might find of interest my comments on the situation
in the title, posted in this comment here:
http://slashdot.org/comments.pl?sid=3346421&cid=42430475
After citing Alan Kay's OOPSLA 1997 "The Computer Revolution Has Not
Happened Yet" speech, the key point I made there is:
"Yet, I can't help but feel that the reason Linus is angry, and
fearful, and shouting when people try to help maintain the kernel and
fix it and change it and grow it is ultimately because Alan Kay is
right. As Alan Kay said, you never have to take a baby down for
maintenance -- so why do you have to take a Linux system down for
maintenance?"
Another comment I made in that thread cited Andrew Tanenbaum's 1992
comment "that it is now all over but the shoutin'":
http://developers.slashdot.org/comments.pl?sid=3346421&threshold=0&commentsort=0&mode=thread&cid=42426755
So, perhaps now, twenty years later, we finally see the shouting begin
as the monolithic Linux kernel reaches its limits as a community process? :-)
Still, even if true, it was a good run.
The main article can be read here:
http://developers.slashdot.org/story/12/12/29/018234/linus-chews-up-kernel-maintainer-for-introducing-userspace-bug
This is not to focus on personalities or the specifics of that mailing
list interaction -- we all make mistakes (whether as leaders or
followers or collaborators), and I don't fully understand the culture
of the Linux Kernel community. I'm mainly raising an issue about how
software design affects our emotions -- in this case, making someone
angry probably about something they fear -- and how that may point the
way to better software systems like FONC aspired to.
dunno...
in this case, I think Torvalds was right; however, he could have handled
it a little more gracefully.
code breaking changes are generally something to be avoided wherever
possible, which seems to be the main issue here.
sometimes it is necessary though, but usually only "for a damn good
reason".
more often though this leads to a shim, such that new functionality can
be provided, while keeping whatever exists still working.
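a toy sketch of that shim idea (all the names here are invented for
illustration, not from any real API): the old entry point survives as a
thin wrapper over the redesigned one, so existing callers never break.

```python
# Hypothetical sketch of a shim: a redesigned function replaces an old
# one, and a thin wrapper keeps existing callers working unchanged.
# All names are invented for illustration.

def read_text_v2(path, *, encoding="utf-8", errors="strict"):
    """new interface: explicit encoding handling, keyword-only options."""
    with open(path, "r", encoding=encoding, errors=errors) as f:
        return f.read()

def read_text(path):
    """old interface, now just a shim over read_text_v2.

    old callers expected lenient decoding; the shim preserves that
    behavior so nothing they do breaks.
    """
    return read_text_v2(path, errors="replace")
```

the point being: the new thing can have better defaults, while the shim
pins the old behavior in one small place instead of everywhere.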
once a limit is hit, then often there will be a "clean break", with a
new shiny whatever provided, which is not backwards compatible with the
old interface (and will generally be redesigned to address prior
deficiencies and open up routes for future extension).
then usually, both will coexist for a while, usually until one or the
other dies off (either people switch to the new interface, or people
rebel and stick to the old one).
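one common way that coexistence phase plays out, sketched hypothetically
(the function names are invented; `warnings` and `DeprecationWarning`
are standard Python machinery): both interfaces live side by side, with
the old one nagging callers to move on.

```python
# Hypothetical sketch of the coexistence phase: old and new interfaces
# live side by side, the old one warning callers to switch.
# Function names are invented for illustration.
import warnings

def frobnicate_v2(x, scale=1):
    """the new shiny interface, redesigned for future extension."""
    return x * scale

def frobnicate(x):
    """old interface: still works, but points callers at the new one."""
    warnings.warn("frobnicate() is deprecated; use frobnicate_v2()",
                  DeprecationWarning, stacklevel=2)
    return frobnicate_v2(x)
```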
in a few cases in history, this has instead led to forks, with the old
and new versions developing in different directions, and becoming
separate and independent pieces of technology.
for example, seemingly unrelated file formats that have a common
ancestor, or different CPU ISAs that were once a single ISA, ...
likewise, at each step, backwards compatibility may be maintained, but
this doesn't necessarily mean that things will remain static. sometimes
there may still be a common subset buried off in there somewhere; in
other cases, the occasional loss of "archaic" details will cause what
remains of this common subset to gradually fade away.
as for design and emotions:
I think people mostly prefer to stay with familiar things.
unfamiliar things will often drive people away, especially if they look
scary or different, whereas people will be more forgiving of things
which look familiar, even if they are different internally.
often this may well amount to shims as well, where something familiar
will be emulated as a shim on top of something different. even if it is
actually fake, people will not care; they can just keep on doing what
they were doing before.
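a toy sketch of that kind of "familiar surface" emulation (everything
here is invented for illustration): code that expects a dict-like thing
keeps working, even though the storage underneath is something else
entirely.

```python
# Hypothetical sketch: a familiar dict-like surface emulated on top of
# a completely different backend (here, two parallel lists, purely a
# stand-in for "something different"). Callers never notice the swap.

class FamiliarDict:
    def __init__(self):
        self._keys = []
        self._vals = []

    def __setitem__(self, key, value):
        try:
            # key already present: overwrite in place
            self._vals[self._keys.index(key)] = value
        except ValueError:
            self._keys.append(key)
            self._vals.append(value)

    def __getitem__(self, key):
        try:
            return self._vals[self._keys.index(key)]
        except ValueError:
            # preserve the familiar failure mode, too
            raise KeyError(key)

    def __contains__(self, key):
        return key in self._keys

    def __len__(self):
        return len(self._keys)
```

whether this is "fake" or not, `d["a"] = 1` still does what the caller
expects, which is all they care about.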
granted, yes, when some people look into the "heart of computing", and
see this seeming mountain of things held together mostly by shims and
some amount of duct tape, they regard it as a thing of horror. others
may see it, and be like "this is just how it is".
luckily, it doesn't go on indefinitely, as often with enough shims, it
will create a sufficiently thick "layer of abstraction" to where it may
become more reasonable to rip out a lot of it, while only maintaining
the surface-level details (for sake of compatibility). compatibility may
be maintained, even if a lot of what goes on in-between has since
changed, and things can be extended that much longer...
granted, by this point, it is often less "the thing it once was" so much
as an emulator.
but, under the surface, what is the real thing and what is an emulator
isn't always all that certain. what usually defines an emulator, then,
is not so much what it actually does, but how big and ugly a seam there
is in it doing so.
or such...
_______________________________________________
fonc mailing list
fonc@vpri.org
http://vpri.org/mailman/listinfo/fonc