Unfounded how?
If you have 128 bits of output and some of them are changed randomly, the
odds that a second, independently corrupted result matches the first are
1/(2^128); the odds of a mismatch are 1 - (1/(2^128)). 2^128 is a rather
large number, about 3.4 x 10^38 (not quite the number of atoms in the
universe, which is nearer 10^80, but large enough that the point stands).
Imagine that every atom in the universe had exactly one identical twin.
How long would it take to find 2 identical atoms? The odds of having 2
sets of random errors match are essentially 0, and the odds get
progressively smaller for larger data sets.
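To put rough numbers on that, here is a minimal sketch in Python. The
task volume and error rate are hypothetical placeholders, not BOINC
measurements; it just shows how long you would expect to wait before a
coincidental match slipped past validation.

# Probability that a second, independently corrupted 128-bit result
# happens to match the first corrupted one bit-for-bit.
p_match = 1 / 2**128
print(f"P(two random errors collide) = {p_match:.3e}")  # ~2.94e-39

# Hypothetical workload figures (assumptions, not measurements):
tasks_per_day = 10_000_000   # replicated task pairs validated per day
error_rate    = 0.01         # fraction of results hit by a random error

# A coincidental match needs BOTH replicas corrupted AND the corrupted
# values to agree exactly.
bad_pairs_per_day = tasks_per_day * error_rate**2
days_to_undetected = 1 / (bad_pairs_per_day * p_match)
print(f"Expected wait for one undetected error: {days_to_undetected:.3e} days")

Even with those generous error rates the expected wait comes out around
10^35 days, which is where the "more than a millennium" claim comes from.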
A redundancy of 2 can detect any random error; a third replica is needed
to figure out which of the two was in error when they disagree.
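For illustration, a minimal sketch of that voting logic in generic
Python (not the actual BOINC validator code):

from collections import Counter

def validate(results):
    """Return the canonical result if a quorum of replicas agree,
    else None (the task must be re-issued)."""
    value, votes = Counter(results).most_common(1)[0]
    # With 2 replicas, agreement means both match (detects any single
    # random error); with 3, a 2-vote majority identifies the bad copy.
    return value if votes >= 2 else None

print(validate([0xCAFE, 0xCAFE]))          # agree -> accepted
print(validate([0xCAFE, 0xBEEF]))          # disagree -> None, re-run
print(validate([0xCAFE, 0xBEEF, 0xCAFE]))  # majority outvotes the error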
I did not say anything about systematic errors, just random errors like
those caused by overheating and overclocking. Systematic errors, such as
those caused by programming errors or widespread hardware faults, have to
be found by other means, but each of those can be tested for once rather
than on every task.
jm7
"Paul D. Buck"
<p.d.b...@comcast
.net> To
[email protected]
10/01/2009 02:44 cc
PM BOINC Developers Mailing List
<[email protected]>,
Raistmer <[email protected]>
Subject
Re: [boinc_dev] [boinc_alpha] Card
Gflops in BOINC 6.10
On Oct 1, 2009, at 6:41 AM, [email protected] wrote:
> Redundancy of 2 catches essentially ALL random errors. Given the
> current number of computers, it would be more than a millennium before
> a single undetected random error was missed - assuming that there were
> only 128 bits of output / task.
Unfounded assertion. And "essentially all" is an unbounded and
unmeasured metric.