Norman Samish wrote: The following abstract suggests that complex
computations are not precisely repeatable.  Doesn't Bruno's Computation
Hypothesis imply that computations ARE precisely repeatable?

"Modern computer microprocessors are composed of hundreds of millions of
transistors that interact through intricate protocols. Their performance
during program execution may be highly variable and present aperiodic
oscillations. In this paper, we apply current nonlinear time series analysis
techniques to the performances of modern microprocessors during the
execution of prototypical programs. While variability clearly stems from
stochastic variations for several of them, we present pieces of evidence
strongly supporting that performance dynamics during the execution of
several other programs display low-dimensional deterministic chaos, with
sensibility to initial conditions comparable to textbook models. Taken
together, these results confirm that program executions on modern
microprocessor architectures can be considered as complex systems and would
benefit from analysis with modern tools of nonlinear and complexity
science."

I don't think that paper is talking about computations being nonrepeatable. The authors say they're not talking about "stochastic variations" (which I take to mean genuine physical sources of randomness), but about some type of deterministic chaos. Since it's deterministic, presumably feeding exactly the same input to exactly the same program will give the same results; the "sensibility to initial conditions" probably just means that if you change a single bit in the input, the output will be very different, something along those lines. And when they say the performance is "variable", I think they mean some measure of performance during a single execution of a given program, not variations found by repeating the execution of the same program and comparing one run to another.
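For what it's worth, that distinction is easy to illustrate with the logistic map, a textbook low-dimensional chaotic system (this is just an illustrative sketch, not the paper's methodology or data):

```python
# A minimal sketch of deterministic chaos: repeatable, yet sensitive
# to initial conditions.  Uses the logistic map x -> r*x*(1 - x).

def logistic_trajectory(x0, steps, r=4.0):
    """Iterate the chaotic logistic map starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Deterministic: the exact same input gives the exact same output,
# run after run -- nothing here is nonrepeatable.
a = logistic_trajectory(0.2, 50)
b = logistic_trajectory(0.2, 50)
assert a == b

# Sensitive to initial conditions: perturb the input by one part in
# 10^12 and the trajectories soon differ by a macroscopic amount.
c = logistic_trajectory(0.2 + 1e-12, 50)
print(max(abs(x - y) for x, y in zip(a, c)))
```

The point is the same one made above: rerunning with an identical input is perfectly repeatable, while an input that differs by a single bit quickly produces a completely different output.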

