Many seem to think that once a machine is conscious it will have
volition or a will of its own. I've argued before that if a machine
does not operate with biological hardware like that of a human, it
will not desire or want anything: it is a thinking entity that, in the
beginning stages of AGI, will run on steady-state hardware that lacks
the ability to feel; it can only compute. Once AGI is in full swing,
or a brain is scanned entirely and simulated, I see no ethical problem
with tinkering with the 'ghost in the machine'. On this point I side
with Searle.

We're emotional machines because we have an amygdala and other complex
molecular machinery, for instance. Once nanoassemblers can build, from
the ground up, an exact human replica: one that feels pain, can become
frightened, can fall in love; something that both thinks and feels,
and so bears the closest relation to us; that is where ethical
considerations are challenged, and where great sci-fi in the interim
is born.

Because work on AGI focuses mostly on cognition, and because of the
hardware AGIs will be launched on, I see no need to worry about
whether an AGI will willingly seek to control, create, or destroy. Why
would it, in and of itself? It is the humans who build and use it that
concern me most. The transition from computing hardware to emotional
hardware is also something to be concerned about: it should be
carefully and rigorously simulated before its execution, so that we do
not take the risk upon ourselves.

Nathan




On 5/31/08, J Storrs Hall, PhD <[EMAIL PROTECTED]> wrote:
> Why do I believe anyone besides me is conscious? Because they are made of
> meat? No, it's because they claim to be conscious, and answer questions
> about
> their consciousness the same way I would, given my own conscious
> experience -- and they have the same capabilities, e.g. of introspection,
> 1-shot learning, synthesis of novel ideas, and access to episodic memory in
> narrative form (etc.) that I associate with being conscious myself.
>
> Build a machine that does *all* of these things and you have no better
> reason
> to claim it isn't conscious than you have to claim a person isn't.
>
> Josh
>
>
> -------------------------------------------
> agi
> Archives: http://www.listbox.com/member/archive/303/=now
> RSS Feed: http://www.listbox.com/member/archive/rss/303/
> Modify Your Subscription:
> http://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com
>

