Jiri Jelinek wrote:
On Nov 3, 2007 1:17 PM, Richard Loosemore <[EMAIL PROTECTED]> wrote:
Isn't there a fundamental contradiction in the idea of something that
can be a "tool" and also be "intelligent"?

No. It could be just a sophisticated search engine.

What I mean is, is the word "tool" usable in this context?

IMO yes.

Question: do you believe it will really be possible to build something that is completely intelligent -- smart enough to understand humans in such a way as to have conversations on the subtlest of subjects, and to understand the functions of things in our world, even though those functions are sometimes defined by the most subtle of human behaviors/preferences/whims -- and yet at the same time be only a sophisticated search engine?

I think that if it were dumb enough that it could be treated as a tool, then it would not be able to understand that it was being used as a tool.

And if it could not understand that, it would have no hope of being generally intelligent.

There is much more to this line of attack, but do you see where I am coming from?

Do you think that the apparent contradiction can be resolved?


Richard Loosemore



To put it the other way around, consider the motivational system of the
best kind of AGI:  it is motivated by a balanced set of desires that
include the desire to explore and learn, and empathy for the human
species.  By definition, I would think, this simple cluster of desires
and empathic motivations *are* the things that "give it pleasure".

In short, software (if that's still what we are talking about) needs
commands and rules (not "desires" and "pleasure") to do what we want
it to do.

But the thing is, you can sometimes change your mind so as to go and get pleasure in a different way.  For example, you could decide to transfer your mind into the cognitive system of an artificial tiger for a week, and during that time you would get pleasure from stalking and jumping onto prey animals, or basking in the sun, or meeting lady tigers.  After automatically being yanked back into human mental form at the end of the holiday, would you say that "you" get pleasure from hunting prey, etc.? Do you get pleasure from the idea of [exploring different sensoria]?

Different activities/inputs stimulate our pleasure center (a set of brain structures) in different ways, moving us across the range of possible pleasures. When we learn how to fully control [and improve] our pleasure center, then, I suppose, indirect stimulation through real-life scenarios (as we know them today) will become less desirable and eventually not preferred. After the "measure pleasure" problem is figured out, the coolest pleasure-wave generators will be researched, and real-life sensation will simply be unable to compete with them.

Regards,
Jiri Jelinek

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?&;


