*sigh* ended up with another 1400-word boat anchor.. caveat
lector. once upon a time, i used to have trouble /starting/ to
write.
> Now a followup question -- why do we have this "all or nothing"
> mindset towards chips? If the only people who "get" anything out
> of this new chip set are "gamers" -- why not market a chip to
> them only? And a less expensive chip towards the home market? Or
> one that's optimized for file serving for *that* market?
>
> Is it a mass production thing? That is, are the economies of
> scale such that specilty chips would cost a lot more to produce?
i'm sure the production costs would be considerable.. as i understand
it, the cost of setting up a new fabrication plant is roughly the same
as setting up a production line for a new type of car.
there are also some important R&D issues to consider. a lot of what
happens in chip development involves taking a breakthrough in the lab..
i.e.: "after ten months of voodoo, we built a prototype that performed a
single operation at 900MHz".. and turning it into an industrial
process.. i.e.: "this plant's production capacity is 10,000 units a day
at 99.5% deliverable quality." it's hard enough to do that with a
single design, and adding variations increases the work geometrically.
the biggest obstacle, though, is the nature of the computer itself.
deep down at its most fundamental level, the computer is the ultimate
chameleon.. it's a machine which can pretend to be almost any other type
of machine. in a very real way, software developers are machine
designers.. it's just that our machines are built out of logic, not
matter.
the soul of a computer is a mathematical abstraction which has gone by
half a dozen different names over the last century.. universal Turing
machines, Post systems, type-zero grammars, mu-recursive functions,
lambda calculus, combinatory logic, blah, blah, blah. the essential
idea is a system of logic which can be used to build a complete
description of itself. thing is, it only takes about three concepts..
assignment, sequential execution, and conditional branching.. to build a
complete Turing machine. everything else is just cosmetic detail.
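to make that concrete, here's a toy machine in python built from just
those three concepts.. assignment, sequential execution, and conditional
branching. (the instruction names and layout are my own invention,
purely for illustration, not any real instruction set.)

```python
# a toy machine with only three capabilities: assignment ("set"/"add"),
# sequential execution (the pc marching forward one step at a time),
# and conditional branching ("jnz" -- jump if a register is nonzero).

def run(prog, regs):
    pc = 0                                   # sequential execution
    while pc < len(prog):
        op, a, b = prog[pc]
        if op == "set":                      # assignment: regs[a] = constant
            regs[a] = b
        elif op == "add":                    # assignment via arithmetic
            regs[a] += regs[b] if isinstance(b, str) else b
        elif op == "jnz" and regs[a] != 0:   # conditional branch
            pc = b
            continue
        pc += 1
    return regs

# multiply 7 * 6 by repeated addition -- no multiply instruction needed
regs = run(
    [("set", "acc", 0),
     ("set", "i", 6),
     ("add", "acc", 7),     # acc += 7
     ("add", "i", -1),      # i -= 1
     ("jnz", "i", 2)],      # loop back to the "add acc" step while i != 0
    {},
)
# regs["acc"] ends up at 42
```

fifteen lines of interpreter, and it can already compute things its
instruction set never mentions.. that's the chameleon trick in
miniature.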
every computer in existence implements those operations, which means
that every computer is, at its core, a universal Turing machine.
mathematics being what it is, any two things equal to the same thing are
equal to each other. that's what makes the computer an anything box..
any system or process which can be described with the right kind of
logic (and it's a pretty general-purpose logic) can be implemented on a
computer. one practical side-effect, though, is that there's no
calculation which can be done on the latest, greatest, funkiest,
mumble-X Pentium which can't also be done on an Apple ][. there's no
difference in the processing capacity of different CPU designs.. only in
the speed with which they do their job, and the features they offer to
make low-level programmers' lives easier.
these days, the only people who touch machine code are the guys who
write back-ends for compilers and the really obsessive sort of game
designer, so only a very select market cares about that. therefore,
the only standard of measurement left to the consumer is execution
speed. if it were just a question of how fast various processors can
perform various operations, comparing them would be simple.
unfortunately, what users see is the execution speed of the machine
*simulated by* the CPU, which has almost no measurable relation to the
capacity of the CPU itself. there are two other factors which have a
much greater effect on performance, one practical, one theoretical.
the practical one is the quality of the software design.. yes, the speed
of a CPU determines the maximum capacity of the machine, just as the
design of a vehicle determines the maximum speed at which it can move.
OTOH, the existence of a well-defined maximum doesn't do anything to the
*minimum* performance limit, which is always zero. in car terms, given
a sufficiently lousy driver and mechanic, you can make a Lamborghini
perform just as badly as a Yugo. the only difference is that it's a
lot easier to get away with being a rotten mechanic when your machine is
completely invisible, and users blame the hardware for your stupidity.
the theoretical limit on CPU performance is called the Von Neumann
bottleneck on processor utilization. computers are more than just
CPUs, they also have data storage issues to consider. the Von Neumann
bottleneck says that it doesn't matter how fast your CPU is, its
performance is limited by the rate at which it gets data from RAM. the
Von Neumann bottleneck can completely overwhelm the speed of the
processor, and there's no theoretical way to get rid of that limit.. all
you can do is work around it, which takes us back to the issue of
design.
as i mentioned in the previous post, the data transfer rate on the
motherboard is maybe 1/20th to 1/100th of what you can get inside the
CPU. if your program has to spend a lot of time reading and writing to
RAM, the effective speed of your CPU is reduced to a fraction of its
theoretical maximum.
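a quick back-of-envelope in python shows how hard the bottleneck bites.
(the clock rate and the 1/20th ratio are just the rough figures from
above, not measurements of any real machine.)

```python
# effective instruction rate when some fraction of instructions have to
# wait on RAM. numbers are illustrative, not benchmarks.
cpu_hz = 333e6            # a hypothetical 333MHz processor
ram_hz = cpu_hz / 20      # the "1/20th" motherboard transfer figure

def effective_rate(mem_fraction):
    # average time per instruction: in-CPU work at full clock speed,
    # memory-bound work throttled to the RAM transfer rate
    t = (1 - mem_fraction) / cpu_hz + mem_fraction / ram_hz
    return 1 / t

print(effective_rate(0.0) / 1e6)   # all in-CPU: the full 333 MHz
print(effective_rate(0.5) / 1e6)   # half the work touches RAM: ~32 MHz
```

note that letting just half the instructions touch RAM throws away
roughly 90% of the processor's theoretical speed.. the slow part
dominates the average almost completely.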
the problem is made worse by the fact that most of today's CPUs use
multiple processing units that operate in parallel.. it's called
'superscalar' execution. what it means is that any given
processing unit in your CPU has a maximum speed of maybe 66MHz, it's
just that you have four or five of them working as a team. that's
great when they're all crunching numbers by themselves, but can get to
be a real problem if they all need to pull information from RAM at the
same time. in the worst-case scenario, your 333MHz processor can have
the same performance as a single 66MHz processor. ideally, that won't
happen often, but it's something you have to watch out for.. which once
again takes us back to design.
the upshot is that it's possible to create two programs from the same
code base, which perform exactly the same sequence of operations and
have only minor variations at the source level, and one of them will
still run ten times as fast as the other on the same machine.
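here's a loose python analogue of that effect.. two functions that
differ by one line at the source level and return identical results,
but one of them does a pile of hidden memory shuffling on every step.
(the example is mine, not from the post above.)

```python
def build_front(n):
    out = []
    for i in range(n):
        out.insert(0, i)    # shifts every existing element each time
    return out

def build_back(n):
    out = []
    for i in range(n):
        out.append(i)       # just drops the value on the end
    return list(reversed(out))

# both return [n-1, ..., 1, 0], but build_front does roughly n*n/2
# element moves while build_back does about n -- same answer, wildly
# different cost as n grows.
```

the machine is completely invisible here.. nothing at the source level
screams "this one is slow", which is exactly why users end up blaming
the hardware.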
that was an awfully long-winded way of saying "consumers wouldn't be
able to tell the difference", but i wanted to show how the emphasis
rests on the nature of the computer and its software, not on some
imaginary laziness or stupidity among consumers. there is just no way
to make meaningful statements about differences in performance between
two computers with different CPUs unless you include a very thick (and
very dull) book full of additional information about the rest of the
hardware, the design of the software, and the nature of the compiler you
used. even then, your results are so specific that the report becomes
worthless if you change one of the umpty-seven parameters.
consumers have enough to worry about just trying to decide among all the
other options on computers.. how much RAM; CD, DVD or R/W-CD; Zip or
Superdisk; what sound and graphics cards; what kind of modem or network
card; PCI or AST, serial or USB; SCSI, IDE, or Firewire? adding
another layer of complexity like "optimized for word processing and
networking" vs. "optimized for raster graphics and spreadsheets" would
be more than the general capacity for disbelief could bear.
we'll see specialization as time goes on, but it will fall under the
heading of 'information appliances' rather than general computing. in
terms of the gamer market it already exists, with things like the Sony
PlayStation. those represent systems which are highly optimized for
graphics processing, and which have sacrificed most of the storage
features that are considered part of a standard computer.
we'll probably see a rise in "OS-free" dedicated fileservers as time
goes on, and the Network Computer is basically just a PlayStation with a
keyboard that loads programs through its network port rather than from a
game cartridge.. though the 'smart card' manifestation of the NC keeps
the card slot for user-specific data.
there's still just too much use for a general, all-purpose Anything
Machine, though.. that's the nature of the beast. and since that's
its nature, that's what hardware designers will continue to shoot for.
mike stone <[EMAIL PROTECTED]> 'net geek..
been there, done that, have network, will travel.