I wrote:

And when they say the performance is "variable", I think they're talking about some measure of performance during a single execution of a given program, not about repeating the execution of the same program multiple times and finding variations from one run to another.

Looks like I was wrong about this part. According to the article at http://www.newscientist.com/article.ns?id=mg18725074.600: 'The team ran a standard program repeatedly on a simulator which engineers routinely use to design and test microprocessors, and found that the time taken to complete the task varied greatly from one run to the next. But within the irregularity, the team detected a pattern, the mathematical signature of "deterministic chaos", a property that governs other chaotic systems such as weather. Such systems are extremely sensitive - a small change at one point can lead to wide fluctuations at a later time. For complex microprocessors, this means that the precise course of a computation, including how long it takes, is sensitive to the processor's state when the computation began.'
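
To get a feel for what "deterministic chaos" means here, the logistic map is the usual toy example (this is just an illustrative Python sketch, not anything from the article): the update rule is completely deterministic, yet two runs started from nearly identical states end up nowhere near each other after a few dozen steps.

# Logistic map: x_{n+1} = r * x_n * (1 - x_n); chaotic for r = 4.0.
r = 4.0

def trajectory(x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.400000000, 50)
b = trajectory(0.400000001, 50)  # starting state differs by only 1e-9

for n in (0, 10, 30, 50):
    print(n, round(a[n], 6), round(b[n], 6))
# The two trajectories agree at first, then diverge completely: deterministic,
# but extremely sensitive to the starting state, much like the processor's
# timing is sensitive to its state when the computation begins.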

Still, as long as the output is the same for the same input, variable timing wouldn't make the computations themselves unrepeatable.
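
That distinction is easy to see with a toy experiment (again just a sketch, Python standard library only; compute() is a made-up placeholder workload): time several runs of the same deterministic computation, then compare outputs and elapsed times. The timings generally differ from run to run, but the outputs are identical.

import time

def compute(n):
    # Deterministic placeholder workload: sum of squares of the first n integers.
    return sum(i * i for i in range(n))

results = []
timings = []
for _ in range(5):
    start = time.perf_counter()
    results.append(compute(1_000_000))
    timings.append(time.perf_counter() - start)

print("outputs identical:", len(set(results)) == 1)    # True: same input, same output
print("timings (s):", [round(t, 4) for t in timings])  # these typically vary run to run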

Jesse

