On 12/20/2007 07:56 PM, Richard Loosemore wrote:
>
> I think these are some of the most sensible comments I have heard on
> this list for a while.  You are not saying anything revolutionary, but
> it sure is nice to hear someone holding out for common sense for a change!
>
> Basically your point is that even if we just build an extremely fast
> version of a human mind, that would have astonishing repercussions.

Thanks. I agree that even if it could do nothing that humans cannot, it
would have astonishing capabilities if it were just much faster. Von
Neumann is an especially good example. He was not in the same class of
creative genius as an Einstein or a Newton, but he was probably faster
than the two of them combined, and perhaps still faster if you add in
the rest of Einstein's IAS buddies as well. Pólya tells the following
story: "There was a seminar for advanced students in Zürich that I was
teaching and von Neumann was in the class. I came to a certain theorem,
and I said it is not proved and it may be difficult. Von Neumann didn't
say anything but after five minutes he raised his hand. When I called on
him he went to the blackboard and proceeded to write down the proof.
After that I was afraid of von Neumann" (How to Solve It, xv).

Most of the things he is known for he did in collaboration. What you
hear again and again that was unusual about his mind is that he had an
astonishing memory, with recall reminiscent of Luria's S., and that he
was astonishingly quick. There are many stories of people (brilliant
people) bringing problems to him that they had been working on for
months, and he would go from baseline up to their level of understanding
in minutes and then rapidly go further along the path than they had been
able to. But crucially, he went where they were going already, and where
they would have gone if given months more time to work. I've heard it
said that his mind was no different in character from that of the rest
of us, just thousands of times faster and with near-perfect recall. This
is contrasted with the mind of someone like Einstein, who didn't get to
general relativity by being the fastest traveler going down a known and
well-trodden path.

How does this relate to AGI? Well, without even needing to posit
hitherto undiscovered abilities, merely having the near-perfect memory
that an AGI would have and thinking thousands of times faster than a
base human already gets you to a von Neumann. And what would von Neumann
have been if he had been thousands of times faster still? It's entirely
possible that given enough speed, there is nothing solvable that could
not be solved.

(I don't mean to suggest that von Neumann was some kind of an
idiot-savant who had no creative ability at all; obviously he was in a
very small class of geniuses who touched most of the extant fields of
his day in deep and far-reaching ways. But still, I think it's helpful
to think of him as a kind of extreme lower bound on what AGI might be.)

>
> By saying that, you have addressed one of the big mistakes that people
> make when trying to think about an AGI:  the mistake of assuming that it
> would have to Think Different in order to Think Better.  In fact, it
> would only have to Think Faster.

Yes, it isn't immortality, but living for a billion years would still be
very different from living for 80. The difference between an
astonishingly huge but incremental change and a change in kind is not so
great.

> The other significant mistake that people make is to think that it is
> possible to speculate about how an AGI would function without first
> having at least a reasonably clear idea about how minds in general are
> supposed to function.  Why?  Because too often you hear comments like
> "An AGI *would* probably do [x].....", when in fact the person speaking
> knows so little about how minds (human or other) really work, that
> all they can really say is "I have a vague hunch that maybe an AGI might
> do [x], although I can't really say why it would...."
>
> I do not mean to personally criticise anyone for their lack of
> knowledge of minds, when I say this.  What I do criticise is the lack of
> caution, as when someone says "it would" when they should say "there is
> a chance that it might."
>
> The problem is that 90% of everything said about AGIs on this list
> falls into that trap.
>

I agree that there seems to be overconfidence in the inevitability of
things turning out the way it is hoped they will, and a lack of
appreciation for the unknowns and the unknown unknowns. This list is
hardly unique, though, in failing to recognize the contingent nature of
how things turn out.

-joseph

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=78316106-039103
