RE: [agi] Cosmodelia's posts: Music and Artificial Intelligence;Jane;

2003-02-03 Thread Ben Goertzel
 "Spirit" isn't emergent, and isn't everywhere, and isn't a figment of the
 imagination, and isn't supernatural.  "Spirit" refers to a real thing,
 with a real explanation; it's just that the explanation is very, very
 difficult.

 --
 Eliezer S. Yudkowsky  http://singinst.org/

Well I think spirit is partially emergent...

And I'm not sure it's difficult to explain *in itself* -- in some ways it's
very simple.

However, I agree that the explanation for how spirit connects with
intelligence appears to be very difficult.

And I would also venture this:

Experimenting with AGI's -- and with human neuromodification -- is going to
teach us a LOT about this thing we call spirit ;-)  !!

Ben G

---
To unsubscribe, change your address, or temporarily deactivate your subscription, 
please go to http://v2.listbox.com/member/?[EMAIL PROTECTED]



Re: [agi] Cosmodelia's posts: Music and Artificial Intelligence;Jane;

2003-02-02 Thread Eliezer S. Yudkowsky
[EMAIL PROTECTED] wrote:


I propose that *any* successful AGI design will be a design not in the
technology of reason, but *in the technology of spirit*.

Now, by spirit, I don't mean the "Let's go pray to lunar crystals"
snake-oil kind. I mean a real, all-encompassing perspective on what it
means to be.

No doubt many AI'ers will claim spirit is irrelevant. No doubt some
will claim that spirit is an emergent property of general
intelligence. But I'm inclined (largely for rational reasons and
partly from intuition) to disagree. Regarding the first point, I don't
believe that spirit is irrelevant. Rather, I believe it is essential
to intelligence. You can't have AGI without human-GI's ability to
perceive spirit. Regarding the second point, I really doubt that
spirit could be an emergent property of AGI. How can the whole emerge
from the part unless the part already contains it?


The problem, as I'm sure you realized, is that you can't go up to an AGI 
researcher and say:  "Build me an AI that includes spirit."  Well, you can 
take that physical action.  But if you say it to one of the usual run of 
AGI researchers, one of two things will happen:

1)  The AGI researcher will say:  "Spirit is poorly defined," not 
realizing that humans start out with an extremely poorly defined intuitive 
view of things like intelligence, spirit, emotion, reality, truth, et 
cetera, and that it is a necessary part of the job of creating AI to 
create clearly defined naturalistic views that completely, clearly, and 
satisfyingly encompass all these realms.

2)  The AGI researcher will invent a spur-of-the-moment definition for 
spirit which doesn't really match what you're interested in - it doesn't 
provide a detailed naturalistic view which, when you see it, is fully and 
satisfyingly identifiable with what you were interested in.  But the AGI 
researcher will insist that the definition must embrace spirit for some 
reason or other, just as earlier AGI researchers insisted that search 
trees, three-layer neural networks, semantic nets, frames, agents, et 
cetera, carried within them all the complexity of the mind.

What you have is an intuition that something has been left undone in 
traditional models of AGI.  You don't know what's missing - you just know 
that it is.  This intuition tends to be missing in most AGI researchers. 
If they had the ability to tell when their theories were missing something 
critical they would not be launching AGI projects based on incomplete 
theories.  There is a selection effect at work; people who know they don't 
understand the mind don't become AI researchers.  But nonetheless, 
although you know something has been left undone, you don't really know 
*what* has been left undone.  You can't describe it in enough detail to 
create it; if you could do that you would be a (true) AGI creator 
yourself.  All you can do is insist that something is missing.  And it is 
a sad fact that you will not get very far with this insistence - even 
though you are, for the record, correct.

I think I understand what you're calling "spirit."  I think I understand 
it well enough to deliberately transfer it from humans, where it exists 
now, into another kind of rational-moral empirical regularity called a 
Friendly AI.  I would say, from within that understanding, that you are 
correct in that spirit is not emergent... very little is, really. 
"Emergence" in AGI is mostly an excuse for not understanding things.  If 
you don't understand something, you can't create it, and you certainly 
can't know in advance that it will emerge.

In the physical natural sciences, where emergence really is the most 
useful paradigm for understanding most phenomena, you would be laughed out 
of the lecture hall if you showed a picture of the Sun's corona and 
insisted that you'd finally got the corona all figured out:  "It's an 
emergent phenomenon of the Sun!"  But of course this explanation tells 
us nothing about *how* the corona emerges from the Sun and whether we're 
likely to see an analogous phenomenon appear in a glass of water or a 
Bose-Einstein condensate.  All that "emergence" says is that the 
explanation doesn't invoke an optimization process such as cognitive 
design or evolutionary adaptation.  (Or to be specific, to say that A is 
naturally emergent from B is to say that given B, one does not need to 
postulate further information, or settings of physical variables, deriving 
from a cognitive or evolutionary optimization process, for A to result.)

"Spirit" isn't emergent, and isn't everywhere, and isn't a figment of the 
imagination, and isn't supernatural.  "Spirit" refers to a real thing, 
with a real explanation; it's just that the explanation is very, very 
difficult.

--
Eliezer S. Yudkowsky  http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
