On Jan 24, 2008, at 10:25 AM, Richard Loosemore wrote:
Matt Mahoney wrote:
--- Richard Loosemore <[EMAIL PROTECTED]> wrote:
The problem with the scenarios that people imagine (many of which are Nightmare Scenarios) is that the vast majority of them involve completely untenable assumptions. One example is the idea that there will be many superintelligent AGIs in the world, all competing with each other for power in a souped-up version of today's arms races. This is extraordinarily unlikely: the speed of development would be such that one AGI would have an extremely large time advantage (head start) over the others, and during that time it would merge the others with itself, to ensure that there was no destructive competition. Whichever way you try to think about this situation, the same conclusion seems to emerge.
As a counterexample, I offer evolution. There is good evidence that every living thing evolved from a single organism: all DNA is twisted in the same direction.

I don't understand how this relates to the above in any way, never mind how it amounts to a counterexample.

If you're actually arguing against the possibility of more than
one individual superintelligent AGI, then you need to either
explain how such an individual could maintain coherence over
indefinitely long delays (speed of light) or just say up front
that you expect magic physics.
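
(A rough figure to make the speed-of-light point concrete; the distances below are standard astronomical values, not something stated in this thread. The one-way signal delay over one AU is

\[
t = \frac{d}{c} = \frac{1.496 \times 10^{11}\,\mathrm{m}}{3.0 \times 10^{8}\,\mathrm{m/s}} \approx 499\,\mathrm{s} \approx 8.3\ \text{minutes,}
\]

and at roughly 30 AU, Neptune's orbit, it is about 4.2 hours one way. Any synchronization of state needs at least a round trip, so twice that.)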

If you're arguing that even though individuals will emerge,
there will be no evolution, then Matt's counterexample applies
directly.

--
Randall Randall <[EMAIL PROTECTED]>
"If we have matter duplicators, will each of us be a sovereign
 and possess a hydrogen bomb?" -- Jerry Pournelle

