On Mon, Jul 11, 2005 at 04:45:21PM -0400, Jesse Mazer wrote:

> I don't think that paper is talking about computations being 
> nonrepeatable--they say that they're not talking about "stochastic 
> variations" (which I think refers to genuine physical sources of 
> randomness), but instead about some type of deterministic chaos. Since it's 
> deterministic, presumably that means if you feed exactly the same input to 
> exactly the same program it will give the same results, the "sensibility to 

It is quite common that even different compiler optimization flags (never mind
different architectures) produce very different trajectories in numerical
simulation; molecular dynamics (MD), for example, is highly susceptible to
this kind of nonlinear/butterfly effect.
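A quick sketch (mine, not from the thread) of the effect being described: in a
deterministic chaotic system, perturbing the initial condition by a single ULP
(the smallest possible change to a double, comparable to what a different
rounding order under other compiler flags would introduce) yields a completely
different trajectory after a few dozen steps. The logistic map stands in here
for a real MD integrator.

```python
import math

def logistic(x, r=4.0, steps=60):
    """Iterate the chaotic logistic map x -> r*x*(1-x)."""
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

x0 = 0.3
x1 = math.nextafter(x0, 1.0)  # perturb by one ULP, ~5.6e-17

print(abs(x1 - x0))                    # tiny initial difference
print(abs(logistic(x0) - logistic(x1)))  # trajectories have fully decorrelated
```

The perturbation grows roughly as 2^n (the map's Lyapunov exponent is ln 2),
so by step 60 the two runs bear no resemblance to each other, even though each
run individually is perfectly repeatable.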

> initial conditions" probably just means if you change a single bit in the 
> input the output will be very different, something along those lines. And 
> when they say the performance is "variable", I think they're talking about 
> some measure of performance during a single execution of a given program, 
> not about repeating the execution of the same program multiple times and 
> finding variations from one run to another.

Eugen* Leitl leitl http://leitl.org
ICBM: 48.07100, 11.36820            http://www.leitl.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE
