[EMAIL PROTECTED] wrote:
That C and not assembler ought to be the target language, no matter
the application. That assembler is deprecated in favour of C.
If that ever comes to pass, I'm going back to a wire-wrap tool.
I don't see why. I did post the sum total of assembler code in the
Plan 9 source directory, libraries and kernel code excluded. Has that
sent you back to the wire-wrap tool yet?
No - I am not advocating 'shedding' C - just opposed to mandating it or
'settling' for it to the exclusion of other tools.
Absent machine code or asm, there are things that would *have to* be done
physically in mask, wire-bonding, or conductor paths.
Nobody said that C had to be the highest common denominator, only that
it should be the lowest, instead of asm.
Ok - let's say if not C as-we-know-it, then something so close in 'span' as to
be functionally the same. 'D' is close, few others are (check the speed alone).
A frustration is that the very things that most need a bullet-proof
implementation - kernel and device drivers - are the hardest to get right at any
usable speed in anything OTHER THAN C.
Yet - despite long-available tools to prevent it - we are still seeing 'buffer
overrun...' security holes, and in new code, not just overlooked legacy code.
'D' - whatever else is good or bad about it - at least closes that one off a bit
better.
There *can be no* one-size-fits-all final answers unless and until all progress
is to be called off and stagnation and decline to the death are mandated from
on high.
That is a view from an uncommon position. It so happens that our
brain can "evolve" much more rapidly than any other organism, with
mutations occurring on a very small timescale. But that perspective
is unique to that particular condition. And one can philosophise on
how useful this continuous mutation really is.
Not even for biologicals with billion+ year history.
Ever wondered how old the nearest amoeba is? From his point of view,
we're just a passing phase :-)
LOL! But that amoeba is not the same as his remotest ancestor, either, if only
in his salinity, temperature, mineral and dissolved-gas adaptations.
'adapt or die' may have long cycle, but it is an unforgiving one.
As you go higher in evolutionary complexity, this becomes more
important, but it's an artifact, not a natural principle. It stems
from organic complexity that is more dependent on active conditions.
The ability to deal with environmental change without need to mutate
seems to me to be more powerful than the ability to mutate at the
slightest whim.
Point. Though Deinococcus radiodurans is far older than we are, and better
traveled, it seems.
C is one such paradigm. Consider that early versions of Windows (up
to 3.1, perhaps) were written in Pascal; at the time that was
Microsoft's bet for the future. C took over from Pascal and only
Microsoft can document the pain and gain of moving to it.
Well - if we have to start talking trash, let's pick a better grade of trash
than WinWOES. That's the software equivalent of the feces of the common
housefly - found nearly everywhere, but not welcome in MY coffee cup!
;-)
That C is still around today, one dares say _despite_ interference from
various well-meaning committees, speaks volumes to the genius of its inventors.
Partially, yes. But google a bit and find those inventors have publicly
expressed mixed feelings about that measure of 'success' at one time or another.
The huge code base can be as stifling to improvement as it is useful.
Inertia cuts both ways.
I don't think it was blind luck, I think it was genius. That
something may eventually supersede C is unarguable, but I think it
will take a very large paradigm shift to make that possible or
necessary.
++L
ACK. It is already 'necessary' in my view, or at the least 'worth attempting'
with greater effort.
And that if only to reduce debug labor - and time spent on it.
Look at Boeing's rationale for insisting on Ada - and mind, I have no great
love for Ada - but the numbers - and the short Sundstrand retraining cycle -
are eye-openers.
When a language becomes as ubiquitous as C has, you have to take cognizance of
the fact that a lot more less-skilled coders have come on board. Then either
modify the language so they are less likely to screw it up - or change the
toolset altogether and restrict C use to the very best coders, or at least
the closest scrutiny.
The *bucks* involved in debugging - or even managing - large team efforts in 'C'
are getting very serious - especially on projects where time is of the essence.
'Too soon we forget' that C as used in Unix/Linux/Plan9 has had a *very* long
and oft painful devel cycle to get here. And we are still fixing old work.
So 'genius'?
At the time, perhaps.
But by now, perhaps as much a product of widely available libs and examples,
dogged determination and 'many eyes, many hands'. Not to mention LESSER
availability of alternatives.
To an extent that is akin to WinWOES pushing out other options merely by
displacement.
C is not going to 'go away' any time soon - but neither is it necessarily as
good a tool for the next 10-20 years as it was the last.
Bill