On Thu, Sep 4, 2008 at 10:04 AM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>
> Hi Pei,
>
> I think your point is correct that the notion of "embodiment" presented by
> Brooks and some other roboticists is naive.  I'm not sure whether their
> actual conceptions are naive, or whether they just aren't presenting their
> foundational philosophical ideas clearly in their writings (being ultimately
> more engineering-oriented people, and probably not that accustomed to the
> philosophical style of discourse in which these sorts of definitional
> distinctions need to be more precisely drawn).

To a large extent, their position is a reaction to 'disembodied'
symbolic AI, though they misdiagnose the problem. Symbolic AI is
indeed 'disembodied', but not because computers lack a body (or
sensorimotor devices); it is because the systems are designed to
ignore their body and their experience.

Therefore, the solution should not be "to get a (robotic) body", but
"to take experience into account".

> I do think (in approximate
> concurrence with your paper) that ANY control system physically embodied in
> a physical system S, that has an input and output stream, and whose input
> and output stream possess correlation with the physical state of S, should
> be considered as "psychologically embodied."  Clearly, whether it's a robot
> or a laptop (w/o network connection if you like), such a system has the
> basic property of embodiment.

Yes, though I would say neither "possess correlation with the physical
state" (which is the terminology of model-theoretic semantics) nor
"psychologically embodied" (which still sounds like a second-rate
substitute for "physically embodied").
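Terminology aside, here is a minimal Python sketch of the shared
point: even a plain laptop has an input stream that correlates with
its own physical state. psutil is a real third-party library, and
sensors_battery() and cpu_percent() are its actual calls, but the
agent loop and the threshold are my own invention.

# A laptop sensing its own physical state: no robot body required.
# Requires the third-party psutil package (pip install psutil).
import psutil

def sense_own_body():
    """Gather inputs that depend on the machine's physical state."""
    battery = psutil.sensors_battery()  # None on machines without one
    return {
        "cpu_load": psutil.cpu_percent(interval=0.1),
        "battery_percent": battery.percent if battery else None,
    }

def act(inputs):
    """Output depends on the sensed state (threshold is arbitrary)."""
    low = inputs["battery_percent"] is not None and inputs["battery_percent"] < 20
    return "reduce_workload" if low else "continue"

if __name__ == "__main__":
    readings = sense_own_body()
    print(readings, "->", act(readings))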

> Furthermore S doesn't need to be a physical
> system ... it could be a virtual system inside some "virtual world" (and
> then there's the question of what properties characterize a valid "virtual
> world" ... but let's leave that for another email thread...)

Every system (in this discussion) is a physical system. It is just
that sometimes we can ignore its physical properties.

> However, I think that not all psychologically-embodied systems possess a
> sufficiently rich psychological-embodiment to lead to significantly general
> intelligence....  My suggestion is that a laptop w/o network connection or
> odd sensor-peripherals, probably does not have sufficiently rich
> correlations btw its I/O stream and its physical state, to allow it to
> develop a robust self-model of its physical self (which can then be used as
> a basis for a more general phenomenal self).

That is a separate issue.  If a system's I/O devices are very simple,
it cannot produce rich behaviors. However, the problem is not caused
by 'disembodiment'. We cannot say that a body must reach a certain
complexity before it counts as a 'body'.

> I think that Varela and crew understood the value of this rich network of
> correlations, but mistakenly assumed it to be a unique property of
> biological systems...

Agreed.

> I realize that the points you made in your paper do not contradict the
> suggestions I've made in this email.  I don't think anything significant in
> your paper is wrong, actually.  It just seems to me not to address the most
> interesting aspects of the embodiment issue as related to AGI.

Understood.

Pei

