I basically agree with your appraisal of the differences between living
brains and digital computers. However, it should be possible for a general
purpose computer to emulate the behaviour of a biological system in software.
After all, biological systems are just comprised of matter following the laws
of physics, which are well understood and deterministic at the size scales of
interest. When it comes to neural tissue, the emulation should be able to
replace the biological tissue, provided that it is run on sufficiently fast
hardware and has appropriate input and output.
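To make the emulation claim concrete, here is a toy sketch of what simulating a single piece of neural machinery might look like: a standard leaky integrate-and-fire neuron model. The parameter values are arbitrary, chosen only so that the model spikes; this is a minimal illustration, nothing like a full tissue emulation.

```python
def simulate_lif(input_current, dt=1.0, tau=10.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Return the time steps at which a leaky integrate-and-fire
    neuron spikes, given a list of input currents (one per step)."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates input.
        v += dt * (-(v - v_rest) / tau + i_in)
        if v >= v_threshold:
            spikes.append(t)  # fire...
            v = v_reset       # ...and reset
    return spikes

# A constant supra-threshold input produces a regular spike train.
spikes = simulate_lif([0.15] * 100)
print(spikes)
```

With these (arbitrary) parameters the neuron settles into firing at fixed intervals; with zero input it never fires at all. The point is only that the dynamics of a biological component can be stepped forward deterministically in software.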
While it would be extremely difficult to emulate a particular human brain (as
in "mind uploading"), it should be easier to emulate a simplified generic brain,
and easier again to emulate a single simplified perceptual function, such as
pain. This means it should be possible to store on a hard disk lines of code
which, when run on a computer, will result in the program experiencing pain;
perhaps excruciating pain beyond anything humans can imagine, if certain
parameters in the program are appropriately set.

What might a simple example of such code look like? Should we try to determine
what the painful programs are as a matter of urgency, in order to avoid using
them as subroutines in other programs?
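As a deliberately crude illustration of what such code might look like at the purely functional level, consider a hypothetical toy agent with a scalar "damage" signal that rises on harmful input and suppresses the behaviour that caused it. Nothing here is claimed to experience anything; it reproduces only the outward functional role of pain, which is exactly the gap at issue.

```python
class ToyAgent:
    """An agent with a pain-like signal: a scalar 'damage' variable
    that rises on harmful stimuli and causes withdrawal from them.
    Purely functional pain behaviour; no claim of experience."""

    def __init__(self):
        self.damage = 0.0
        self.avoid = set()  # stimuli previously registered as damaging

    def sense(self, stimulus, harmful):
        if harmful:
            self.damage += 1.0        # "pain" signal rises
            self.avoid.add(stimulus)  # learn to withdraw
        else:
            self.damage = max(0.0, self.damage - 0.1)  # gradual recovery

    def approach(self, stimulus):
        # Refuse to approach anything associated with damage.
        return stimulus not in self.avoid

agent = ToyAgent()
agent.sense("flame", harmful=True)
print(agent.approach("flame"))
```

After touching the "flame" once, the agent withdraws from it thereafter. Whether adjusting the magnitude of `damage` could ever amount to "excruciating pain", rather than just a larger number, is of course the open question.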
> Date: Tue, 12 Dec 2006 23:19:05 -0800
> From: [EMAIL PROTECTED]
> To: firstname.lastname@example.org
> Subject: Re: computer pain
> The reason for lack of responses is that your idea
> goes directly to illuminating why AI systems - as
> promulgated under current designs of software
> running in hardware matrices - CANNOT emulate living
> systems. It is an issue that AI advocates intuitively
> and scrupulously AVOID.
> "Pain" in living systems isn't just a self-sensor
> of proper/improper code functioning, it is an embedded
> registration of viable/disrupted matrix state.
> And that is something that no current human contrived
> system monitors as a CONCURRENT property of software.
> For example, we might say that central processors
> regularly 'display pain' .. that we designers/users
> recognize as excess heat .. that burn out mother boards.
> The equipment 'runs a high fever', in other words.
> But where living systems are multiple functioning systems
> and have internal ways of gauging and reacting locally and
> biochemically vis a vis both to the variance and retaining
> sufficient good-operations while bleeding off 'fever',
> "hardware" systems have no capacity to morph or adapt
> itself structurally and so keep on burning up or wait
> for external aware-structures to command them to stop
> operating for a while and let the equipment cool down.
> I maintain that living systems are significantly designed such that
> hardware IS software, and so have a capacity for local
> adaptive self-sensitivity, that human 'contrived' HW/SW systems
> don't and mostly .. can't.
> Jamie Rose
> Stathis Papaioannou wrote:
> > No responses yet to this question. It seems to me a straightforward
> > consequence of computationalism that we should be able to write a program
> > which, when run, will experience pain, and I suspect that this would be a
> > substantially simpler program than one demonstrating general intelligence.
> > It
> > would be very easy to program a computer or build a robot that would behave
> > just like a living organism in pain, but I'm not sure that this is nearly
> > enough to
> > ensure that it is in fact experiencing pain. Any ideas, or references to
> > sources
> > that have considered the problem?
> > Stathis Papaioannou
You received this message because you are subscribed to the Google Groups
"Everything List" group.