> From: Alberto Monteiro <[EMAIL PROTECTED]>
>
> The Fool wrote:
>
> >But that is not the issue. You create a significantly advanced (and
> >probably slightly bug prone) system that is designed to make a better
> >system. That system should hopefully be bug free (if the original
> >programmers didn't f%&^ up severely). That system in turn creates an
> >even more advanced system that should be even more bug free (because
> >computers are not prone to the kinds of mistakes that humans make). Even
> >if that system was just a pure rewrite of itself, that should be enough
> >to create a perfect system within a few iterations. Run the system for a
> >few hundred thousand iterations (just to be sure) and you would have a
> >system that became infalliable. Also that system would become more and
> >more efficient with each iteration.
>
> Uh?
>
> This process will surely increase *one* efficience - it would make
> reproducing faster. Natural selection would quickly operate, and
> the googol-th generation [*] would be as simple as a virus, and
> it would *not* be bug-free, it would be a naked bug.
No. (See my other post.) Each iteration of the cycle would take more time, because as the complexity of the program increases (probably at a linear rate), the number of things the program has to test increases (probably at a geometric rate). So even if the system is also increasing the power of its hardware each cycle, each iteration would get slower and slower, with the per-iteration time growing without bound, unless the hardware growth rate outpaces the growth rate of the program's complexity (and hence of its testing burden).
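
To make the shape of that argument concrete, here is a toy back-of-the-envelope model (in Python, since nothing in the thread specifies a language; every constant below is an assumption picked only to show the trend, not a measurement of anything):

# Toy model of the claim above. Assumptions: the amount of
# verification work multiplies by TEST_GROWTH each iteration,
# and hardware speed multiplies by HARDWARE_GROWTH each iteration.
TEST_GROWTH = 1.5        # assumed geometric growth in things to test
HARDWARE_GROWTH = 1.2    # assumed geometric growth in hardware speed

test_work = 1.0          # relative verification work in iteration 1
speed = 1.0              # relative hardware speed in iteration 1

for iteration in range(1, 11):
    wall_time = test_work / speed   # time to verify one new version
    print(f"iteration {iteration:2d}: relative time = {wall_time:6.2f}")
    test_work *= TEST_GROWTH        # more things to test next time
    speed *= HARDWARE_GROWTH        # faster hardware next time

With these made-up numbers the per-iteration time keeps climbing, because the testing burden multiplies faster than the hardware speeds up; flip the two constants and the iterations get faster instead, which is exactly the "unless" clause above.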
