Mark Waser wrote:
> Richard Loosemore wrote:
>> To say to an investor that AGI would be useful because we could use
>> them to build travel agents and receptionists is to utter something
>> completely incoherent.
>
> Not at all. It is catering to their desires and refraining from
> forcibly educating them. Where is the harm? It's certainly better than
> getting the door slammed in your face.
I think this is a mistake. Selling investors the idea of replacement
travel agents and housemaids is something that they know, in their gut,
is a stupid idea IN THIS CONTEXT. The context is that you are saying
that you will build something with the completely general powers of
thought that a person has. If you can build such a thing, then claiming that it will be put to a trivial task after (say) $100 million of development money makes no business sense whatsoever.
A big part of being coherent in front of an investor is being able to think your idea through to its logical conclusion. Soft-pedaling the idea and pretending that it will be less useful than it really is comes across just as badly as overselling it: this is "thinking too small". Either way, you look as if you haven't really thought it through.
Here is what I would call thinking it through.
The definition of AGI is that it has all the powers of thought that we have, rather than being able to answer questions about a blocks world perfectly while being completely incapable of talking about the weather. We all agree on this, no?
With that understood, there are some obvious consequences of building an AGI. One is that we will be able to duplicate a machine that has acquired expert-level knowledge in its field. This is a stupendous advance on the situation today, because it means that if one AGI can be taught to reach expert level in some field, it can be duplicated manyfold, and suddenly we have a vast army of experts pushing back the frontiers together.
Now the question is whether it really would be so much easier to produce a housemaid than a medical expert. It is not at all obvious that the housemaid or travel agent will be a step on the road. If we understand how to make something think, why would our efforts happen to land on exactly the intelligence-point that equates to a travel agent? Just because this is the kind of work a human is forced to do when they cannot get anything better does not mean that it is a natural level of intellectual capacity. The first AGI could just as easily be a blithering idiot, an idiot savant, a rocket scientist or an unsurpassable genius. To ask it to be a travel agent is to assume that what you build will have one very particular level of intelligence and be incapable of improvement, which raises the question "Why would it only reach that level?"
I think, in truth, that this talk of using the first AGIs as travel agents and housemaids is based on a weak analysis of what it would mean to produce an early prototype or a step on the road to full AGI. Because we have in our minds this picture of human beings and the way they develop, some people automatically assume that an early AGI would be equivalent to a housemaid. What I am saying is that, at the very least, this is by no means obvious.
I think that if we can build such thinking machines, we will surely by that stage have come to understand the dynamics of intellectual development in ways that we have no hope of doing today: we will be able to look inside the developing mind and see what factors cause some thinkers to have trouble getting their thoughts together while others zoom on to great heights of achievement. Given that, we will have a much greater chance of producing something that can continue to develop without hitting a roadblock of some kind. In my opinion, what makes a travel agent a travel agent is not primarily a lack of horsepower, but a complicated interplay of drives and social circumstances (with some contribution from lack of horsepower). A travel agent, in other words, is more like a genius who got stopped along the way than a person whose brain simply did not have the right design.
Richard Loosemore