On 7/12/06, Andrew Lentvorski <[EMAIL PROTECTED]> wrote:
> Paul G. Allen wrote:
>>> Sigh.  I have been hearing about the asynchronous processor thing for
>>> the last 15 years.  It is no closer than it has ever been.

Flash has been around for 30 years, right? So have gyroscopes, and they are
just now coming out with semiconductor-based MEMS gyros that are a tenth the
cost of previous ones. There's more to something than just innovation. We
live in a capitalist world... it can't just be better, it has to be practical
and cost-effective.

>> http://www.arm.com/news/12013.html

> So, they announced it this year.  And the previous year.  And the
> previous year before that.
>
> And 4 gazillion cell phones still ship with a clocked ARM9.  Gee, I
> wonder why?  Perhaps because it's not a win?


That sounds like a great reason to keep trying. The fact is, it's been
demonstrated that these designs use less power, but clearly there are other
issues at hand that affect their usability in the broader market.

> Cell phones are *very* sensitive to power.  If it were a win, they would
> switch in one generation.


If that were the only thing that mattered.

>> Most processors spend most of their time idle. Power consumption is not
>> the only reason to remove clock signals.

> It's the main one.  Modern processors spend 50%+ of their power just
> powering the clock grid.


Can we get a source for that? Perhaps qualify the class and type of
processor? Sounds like the Intel NetBurst architecture to me (if it's even
true... in fact, I KNOW that at the 65nm process node and below most of the
power goes to leakage currents). This is another case where clock speed makes
a big difference: dynamic power (exclusive of leakage) scales roughly as
C·V²·f, linearly with clock frequency and with the square of core voltage,
and a slower clock is what lets you drop the voltage in the first place. That
50% number seems bogus... the reality is that these chips save power because
they shut off the parts that aren't necessary, and usually a clock signal is
what defines whether a "block" is "on" or "off" in a chip to save power.
Notice how all the chip makers just stopped emphasizing megahertz? Now might
be a good time to get off the synchronous bandwagon.
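
Since we're throwing scaling laws around, here's a back-of-the-envelope
sketch of that C·V²·f relation. Every number in it (activity factor,
capacitance, voltages) is made up purely for illustration, not a measurement
of any real chip:

/* Back-of-the-envelope dynamic power: P = alpha * C * V^2 * f.
 * All values below are illustrative only, not data for any real part. */
#include <stdio.h>

static double dynamic_power(double alpha, double cap_f, double vdd, double freq_hz)
{
    return alpha * cap_f * vdd * vdd * freq_hz;   /* watts */
}

int main(void)
{
    const double alpha = 0.2;    /* activity factor: fraction of gates switching per cycle */
    const double cap   = 1e-9;   /* effective switched capacitance, farads */

    /* Halving the frequency alone halves dynamic power... */
    printf("1.2 V @ 1 GHz:   %.3f W\n", dynamic_power(alpha, cap, 1.2, 1e9));
    printf("1.2 V @ 500 MHz: %.3f W\n", dynamic_power(alpha, cap, 1.2, 5e8));

    /* ...but dropping the core voltage, which the slower clock allows, saves far more. */
    printf("0.9 V @ 500 MHz: %.3f W\n", dynamic_power(alpha, cap, 0.9, 5e8));
    return 0;
}

The point being: the clock frequency by itself is the linear term; the real
win is the voltage headroom a slower clock buys you.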

>> Running clock signals all over a chip is expensive in timing
>> (propagation delays), power consumption, physical space, and other
>> areas. One of the biggest problems with synchronous systems is the
>> propagation delays imposed upon the timing signals.

> You call it a problem; I call it an advantage.


Advantage, huh? Is that coming from a circuit design background? All of the
aforementioned problems get worse as you go faster and faster. The reality is
that the baseband and common parts of a cell phone don't run at that high a
frequency, but as clocks get faster these things become a real issue. Can I
make up a word to call it too?

> The clock provides a hard, temporal boundary.  You must get done within
> that time, or it's a bug.  That is the task of engineering.  Decoupling
> a complex problem into manageable pieces.


Yeah... whatever you do, don't think outside the box, because then you might
be a scientist, right? I'm currently working at a company that has gotten
CMOS photonics working five years ahead of Intel and many others, by fusing
engineering and science. You seem to be reciting outdated philosophy, and I
doubt a single person in the world who's designed an ASIC will tell you they
like clocks and all the challenges associated with them. That doesn't mean
the problems are EASIER with async, but the benefits are clear.

Maybe most people can't even comprehend breaking a paradigm like clocked
logic... mostly I just think analog/RF/microwave electronics methodologies
scare most people. That's why there are lots of digital guys, and lots of
programmers, and probably 90% of programmers out there are Windows
application programmers. Very few of these people are innovation-based
engineers; they are tools-based engineers. So let's not degrade the work done
by "engineers" or "scientists" (whichever you prefer) at cutting-edge
companies where all the real stuff happens. I'm not gonna lie, I think that's
a startlingly shallow view right there that doesn't have any objective
foundation.

Let's be straight... clocks are for synchronization, so you know when the
train is going to be there, or when to show up for your meeting. The fire
brigade (the common async example) just waits for the water bucket to be put
in its hands. There aren't many situations where having a clock, or lines to
guide you, makes things BETTER... just easier. Now that we have computers to
handle many of the harder issues with async logic (much like we'll all have
self-driving cars before too long that make the lines on the road pointless),
clocks don't really matter all that much. And just so you know, mostly you
design a whole circuit before you actually simulate it to figure out how fast
it can go. The clock isn't the starting point... it's the end.
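
If the fire-brigade picture sounds hand-wavy, here's a toy sketch of the
request/acknowledge handshake that takes the clock's place in an async
pipeline stage. It only illustrates the protocol (the struct and function
names are mine, and a single-threaded loop stands in for two independent
circuits), not real asynchronous circuit design:

/* Toy model of the "fire brigade": two stages pass buckets (data) using a
 * request/acknowledge handshake instead of a shared clock. */
#include <stdio.h>
#include <stdbool.h>

struct link { bool req, ack; int data; };

/* Sender puts a bucket on the link and raises req: "here's a bucket". */
static void offer_bucket(struct link *l, int value)
{
    l->data = value;
    l->req  = true;
}

/* Receiver takes the bucket only when one is offered, then raises ack: "got it". */
static bool take_bucket(struct link *l, int *out)
{
    if (l->req && !l->ack) {
        *out  = l->data;
        l->ack = true;
        return true;
    }
    return false;
}

/* Once both sides have seen the transfer, return the wires to idle.
 * (A real four-phase handshake drops req first, then ack; collapsed here for brevity.) */
static void finish_handshake(struct link *l)
{
    if (l->req && l->ack) {
        l->req = false;
        l->ack = false;
    }
}

int main(void)
{
    struct link l = {0};
    for (int bucket = 1; bucket <= 3; bucket++) {
        offer_bucket(&l, bucket);
        int got;
        if (take_bucket(&l, &got))
            printf("stage 2 received bucket %d\n", got);
        finish_handshake(&l);   /* each transfer paces itself; no global clock anywhere */
    }
    return 0;
}

Nothing in there cares what frequency anything runs at; the data transfer
itself is the synchronization, which is exactly the bucket-in-hand behaviour.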

> Look at some of the most reliable software.  Gee, things like vxWorks on
> the Mars rover tend to use interrupts as a global "clock".  I wonder

Ever hear of a watchdog timer? A sampling rate? Frequency counting with
interrupts? EVERY SINGLE RTOS USES A TIMER INTERRUPT AS A TIMER TICK. When
your chip runs at 50MHz, counting off clock cycles in a register and firing
an interrupt every so many of them (frequency division) is a pretty darn easy
way to get a more practical clock for real-world operations. Let's not
confuse common sense with whatever it was you were trying to say here.
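
To spell out what I mean by frequency division, here's a minimal sketch of
the usual RTOS tick. The names (TIMER_RELOAD, timer_irq_handler, jiffies) are
hypothetical placeholders, not any particular chip's registers or any real
RTOS's API, and the main() just stands in for the hardware firing the
interrupt:

/* Sketch of an RTOS tick: a hardware timer divides the CPU clock down and
 * fires a periodic interrupt the OS keeps time off.  Illustrative only. */
#include <stdio.h>
#include <stdint.h>

#define CPU_HZ        50000000u          /* 50 MHz core clock               */
#define TICK_HZ       1000u              /* desired 1 kHz (1 ms) OS tick    */
#define TIMER_RELOAD  (CPU_HZ / TICK_HZ) /* = 50,000 cycles between ticks   */

static volatile uint32_t jiffies;        /* monotonically increasing tick count */

/* In real hardware this would be wired to the timer interrupt vector. */
void timer_irq_handler(void)
{
    jiffies++;                           /* scheduler, timeouts, watchdog kicks hang off this */
}

int main(void)
{
    for (int i = 0; i < 3; i++)          /* pretend the timer fired three times */
        timer_irq_handler();
    printf("reload every %u cycles -> %u Hz tick, jiffies = %u\n",
           (unsigned)TIMER_RELOAD, (unsigned)TICK_HZ, (unsigned)jiffies);
    return 0;
}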

> why?  Perhaps because it serves as a temporal boundary and you can make
> guarantees?  Sound familiar?


Great how one example proves it all. Actually, they used three FPGAs that
checked each other's work on every step, and when one made an error it was
instantly reprogrammed, because the error most likely came from radiation
messing with the circuits. Ever learn calculus? Funny how it was the move
from discrete numbers to analog, continuous reality that let us express
concepts mathematically much more completely.
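
For what it's worth, the cross-checking idea itself is simple enough to
sketch. This is a toy 2-of-3 majority vote in the spirit of that scheme, with
made-up values; it is not the actual flight design:

/* Toy triple-modular-redundancy vote: three units compute the same word and
 * the majority wins, masking a single radiation-induced bit flip. */
#include <stdio.h>
#include <stdint.h>

/* Bitwise 2-of-3 majority: each output bit is whatever at least two inputs agree on. */
static uint32_t vote(uint32_t a, uint32_t b, uint32_t c)
{
    return (a & b) | (a & c) | (b & c);
}

int main(void)
{
    uint32_t good    = 0xCAFEF00D;
    uint32_t flipped = good ^ (1u << 7);   /* one unit takes a radiation bit flip */

    uint32_t result = vote(good, flipped, good);
    printf("voted result: 0x%08X (the disagreeing unit would get reloaded)\n",
           (unsigned)result);
    return 0;
}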

> Why hardware wants to be more like software in reliability is completely
> beyond me.


Who said that? Software is much less deterministic and reliable than
hardware, and the great thing about most software is that these days people
just throw resources at it left and right. Most of my coding has gone into
genetic analysis (computationally hard), speech recognition (computationally
hard), and embedded medical devices (computationally limited), and that's
what I consider being a software engineer. What most people really want, and
digital design is a good example of this, is to make a hard task easier, like
programming most applications. Let's be realistic (my philosophy tangent
here): in 20 years, 90% of people will be programmers... it's just a question
of what you program. Methodologies take you most of the way there, but if
you're going to do something truly novel, you're gonna have to use your head.
You're gonna have to do experiments, because you're doing something that's
never been done before. And you're gonna have to be wrong sometimes, because
that's how you learn.

> -a

Sorry for the vehemence; it's just passion, not angst. As for why this issue
belongs on a Linux list, I don't know... but it's hard to let someone make a
bunch of unrealistic, unsubstantiated, and philosophical claims about
something you know a lot about. Stating opinions as fact is not fair, nor
helpful to anyone.

-Tom

--
-Thomas Gal
http://www.enigmatecha.com/thomasgal.html


