Hi Stathis/Jamie et al.
I've been busy elsewhere in self-preservation mode ... deleting emails
madly ... frustrating, with so many threads left hanging ... oh well ... but
I couldn't go past this particular dialogue.

I am having trouble believing that you actually hold the below to be the
case! Lines of code that experience pain? Upon what law of physics is that
based?

Which one hurts more:

if (INPUT_A >= 1) then { ... }
if (INPUT_A >= 1) then { ... }
if (INPUT_A >= 10) then { ... }

Also: in a distributed application ... if I put the program on Earth, the
input on Mars and the CPU on the Moon, which bit actually does the hurting,
and when? It's still a program, still running - functionally the same
(time delays, I know - not quite the same ... but you get the idea).
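The distribution point can be sketched in code. This is a minimal, hypothetical example (the function names are mine, not from any library): the same threshold check behaves identically whether its input is produced locally or fetched from a remote machine, so "where the hurting happens" has no answer at the program level.

```python
# Hypothetical sketch: the same conditional, with input supplied
# locally or via some remote transport. Only latency differs; the
# program's behaviour is unchanged.

def threshold_check(input_a):
    # Stand-in for the "painful" conditional in the example above.
    if input_a >= 10:
        return "OUCH!"
    return "ok"

def remote_input(fetch):
    # 'fetch' stands in for any transport (a socket to Mars, say);
    # the returned value is all the program ever sees.
    return fetch()

local = threshold_check(12)
distributed = threshold_check(remote_input(lambda: 12))
assert local == distributed  # identical result, wherever the input lives
```

The point of the sketch: nothing in the program distinguishes the local run from the distributed one, which is exactly why asking which physical piece "does the hurting" gets no purchase.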

The idea is predicated on the presumed (not proven) non-existence of a
physical mechanism for experience - that experience somehow equates with
the manipulation of abstract symbols as information, rather than with the
fabric of reality as information; that pretending to be a neuron
necessarily delivers everything a neuron, as a chunk of matter,
participates in.

It also completely ignores the ROLE of the experiences. There's a reason
for them. Unless you know the role, you cannot assume that the software
model will inherit that role. And with no role, why bother with it? I don't
have to put "OUCH!" "OUCH!" "OUCH!" in the above.

What you are talking about is 'strong AI'; its functionalist
assumptions need to be carefully considered.

Another issue: if a life-like artefact visibly behaves as though it is in
agony, the only ones actually getting hurt are the humans watching it, who
have real experiences and empathy based on real qualia. It might be OK if
it were play. But otherwise? Hmmmm.



> Jamie,
> I basically agree with your appraisal of the differences
> between living brains and digital computers. However, it
> should be possible for a general purpose computer to
> emulate the behaviour of a biological system in
> software. After all, biological systems are just
> comprised of matter following the laws of
> physics, which are well understood and deterministic
> at the size scales of interest.

> When it comes to neural tissue, the emulation should be
> able to replace the original provided that it is run on
> sufficiently fast hardware and has appropriate
> interfaces for input and output.
> While it would be extremely difficult to emulate a
> particular human brain (as in "mind uploading"), it should
> be easier to emulate a simplified generic brain, and easier
> again to emulate a single simplified perceptual function,
> such as pain. This means that it should be possible to store
> on a hard disk lines of code which, when
> run on a PC, will result in the program experiencing pain;
> perhaps excruciating pain beyond what
> humans can imagine, if certain parameters in the program
> are appropriately chosen. What might a simple example of
> such code look like? Should we try to determine what
> the painful programs are as a matter of urgency,
> in order to avoid using them in
> subroutines in other programs?
> Stathis Papaioannou

 You received this message because you are subscribed to the Google Groups 
"Everything List" group.