In a message dated Tue, 18 Dec 2001  9:12:16 PM Eastern Standard Time, "Alberto 
Monteiro" <[EMAIL PROTECTED]> writes:

> 
> The Fool wrote:
> >
> >But that is not the issue.  You create a significantly advanced (and
> >probably slightly bug prone) system that is designed to make a better
> >system.  That system should hopefully be bug free (if the original
> >programmers didn't f%&^ up severely).  That system in turn creates an
> >even more advanced system that should be even more bug free (because
> >computers are not prone to the kinds of mistakes that humans make). Even
> >if that system was just a pure rewrite of itself, that should be enough
> >to create a perfect system within a few iterations.  Run the system for a
> >few hundred thousand iterations (just to be sure) and you would have a
> >system that became infallible.  Also that system would become more and
> >more efficient with each iteration.
> >
> Uh?
> 
> This process will surely increase *one* efficiency - it would make
> reproducing faster. Natural selection would quickly operate, and
> the googol-th generation [*] would be as simple as a virus, and
> it would *not* be bug-free, it would be a naked bug.
> 
I think Alberto has hit the nail on the head. Self-duplicating digital systems do 
exist. DNA is one. But it is not error-free. If copying were perfect, then there would 
be no evolution. So there will always be mistakes (mutations) in a system. And as 
systems become more complex they have more mistakes. There are upper limits to the genetic 
complexity of the various types of organisms, which have a variety of error-reduction 
and correction tools. I would guess the same would be true for machine environments. Selection 
experiments have been run in virtual environments, and the same rules seem to hold as 
in natural selection.
