RE: computer pain

2006-12-17 Thread Stathis Papaioannou
Colin Hales writes: Stathis said SNIP and Colin has said that he does not believe that philosophical zombies can exist. Hence, he has to show not only that the computer model will lack the 1st person experience, but also lack the 3rd person observable behaviour of the real

RE: computer pain

2006-12-17 Thread Colin Geoffrey Hales
Stathis said: snip If you present an object with identical sensory measurements but get different results in the chip, then that means what you took as sensory measurements was incomplete. For example, blind people might be able to sense the presence of someone who silently walks into the room

Re: computer pain

2006-12-17 Thread Brent Meeker
James N Rose wrote: Brent Meeker wrote: If consciousness is the creation of an inner narrative to be stored in long-term memory then there are levels of consciousness. The amoeba forms no memories and so is not conscious at all. A dog forms memories and even has some understanding

RE: computer pain

2006-12-17 Thread Stathis Papaioannou
perceive anything. Writers, philosophers, mathematicians can all be creative without perceiving anything. Stathis Papaioannou

Re: computer pain

2006-12-17 Thread 1Z
Colin Geoffrey Hales wrote: What I expect to happen is that the field configuration I find emerging in the guts of the chips will be different, depending on the object, even though the sensory measurement is identical. The different field configurations will correspond to the different

RE: computer pain

2006-12-17 Thread Colin Geoffrey Hales
Colin, You have described a way in which our perception may be more than can be explained by the sense data. However, how does this explain the response to novelty? I can come up with a plan or theory to deal with a novel situation if it is simply described to me. I don't have to

Re: computer pain

2006-12-17 Thread James N Rose
Brent Meeker wrote: That notion may fit comfortably with your presumptive ideas about 'memory' -- computer stored, special-neuron stored, and similar. But the universe IS ITSELF 'memory storage' from the start. Operational rules of performance -- the laws of nature, so to speak --

Re: computer pain

2006-12-16 Thread Brent Meeker
Colin Geoffrey Hales wrote: So your theory is that the electromagnetic field has an ability to learn which is not reflected in QED - it's some hitherto unknown aspect of the field and it doesn't show up in the field violating Maxwell's equations or QED predictions? And further this aspect

RE: computer pain

2006-12-16 Thread Stathis Papaioannou
Colin Hales writes: Stathis wrote: I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a storm. We don't

Re: computer pain

2006-12-16 Thread Colin Geoffrey Hales
So the EM fields account for the experiences that accompany the brain processes. A kind of epiphenomenon. So why don't my experiences change when I'm in an MRI? I haven't been through the detail - I hope to verify this in my simulations to come but... As far as I am aware MRI magnets

RE: computer pain

2006-12-16 Thread Colin Geoffrey Hales
I understand your conclusion, that a model of a brain won't be able to handle novelty like a real brain, but I am trying to understand the nuts and bolts of how the model is going to fail. For example, you can say that perpetual motion machines are impossible because they disobey the

RE: computer pain

2006-12-16 Thread Stathis Papaioannou
Colin Hales writes: I understand your conclusion, that a model of a brain won't be able to handle novelty like a real brain, but I am trying to understand the nuts and bolts of how the model is going to fail. For example, you can say that perpetual motion machines are impossible

Re: computer pain

2006-12-16 Thread Brent Meeker
Stathis Papaioannou wrote: Colin Hales writes: I understand your conclusion, that a model of a brain won't be able to handle novelty like a real brain, but I am trying to understand the nuts and bolts of how the model is going to fail. For example, you can say that perpetual motion

RE: computer pain

2006-12-15 Thread Colin Geoffrey Hales
So you are saying the special something which causes consciousness and which functionalism has ignored is the electric field around the neuron/astrocyte. But electric fields were well understood even a hundred years ago, weren't they? Why couldn't a neuron be simulated by something like a

RE: computer pain

2006-12-15 Thread Stathis Papaioannou
Colin, I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a storm. We don't normally see inside a

Re: computer pain

2006-12-15 Thread Brent Meeker
Colin Geoffrey Hales wrote: So you are saying the special something which causes consciousness and which functionalism has ignored is the electric field around the neuron/astrocyte. But electric fields were well understood even a hundred years ago, weren't they? Why couldn't a neuron be

Re: computer pain

2006-12-15 Thread Colin Geoffrey Hales
Brent said: snip Of course they describe things - they aren't the things themselves. But the question is whether the description is complete. Is there anything about EM fields that is not described by QED? Absolutely HEAPS! Everything that they are made of and how the components interact to

RE: computer pain

2006-12-15 Thread Colin Geoffrey Hales
Stathis wrote: I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a storm. We don't normally see inside a

Re: computer pain

2006-12-15 Thread Brent Meeker
Colin Geoffrey Hales wrote: Stathis wrote: I can understand that, for example, a computer simulation of a storm is not a storm, because only a storm is a storm and will get you wet. But perhaps counterintuitively, a model of a brain can be closer to the real thing than a model of a storm. We

Re: computer pain

2006-12-15 Thread Colin Geoffrey Hales
So your theory is that the electromagnetic field has an ability to learn which is not reflected in QED - it's some hitherto unknown aspect of the field and it doesn't show up in the field violating Maxwell's equations or QED predictions? And further this aspect of the EM field is able to

RE: computer pain

2006-12-14 Thread Stathis Papaioannou
Jamie Rose writes: Stathis, As I was reading your comments this morning, an example crossed my mind that might fit your description of in-place code lines that monitor 'dysfunction' and exist in-situ as a 'pain' alert... that would be error evaluating 'check-sum' computations. In a

RE: computer pain

2006-12-14 Thread Stathis Papaioannou
Brent Meeker writes: Stathis Papaioannou wrote: Brent Meeker writes: I would say that many complex mechanical systems react to pain in a way similar to simple animals. For example, aircraft have automatic shut downs and fire extinguishers. They can change the flight controls

RE: computer pain

2006-12-14 Thread Stathis Papaioannou
Hi Stathis/Jamie et al. I've been busy elsewhere in self-preservation mode deleting emails madly... frustrating, with so many threads left hanging... oh well... but I couldn't go past this particular dialog. I am having trouble

Re: computer pain

2006-12-14 Thread James N Rose
Yes Stathis, you are right, 'noxious stimulus' and 'experience' are indeed separable - but - if you want to do an analysis of comparing, it's important to identify global parameters and potential analogs. My last post's example tried to address those components. I've seen stress diagrams of

RE: computer pain

2006-12-14 Thread Colin Geoffrey Hales
no clue where it was and learn nothing looking remotely normal. Meanwhile Marvin inside can do perfectly good 'zombie room' science. There's a whole axis of modelling orthogonal to the soma membrane which gets statistically abstracted out by traditional Hodgkin/Huxley models. The neuron
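The point-model abstraction the post refers to can be illustrated with a sketch of my own (not from the thread): a single-compartment leaky membrane integrated with Euler steps. All parameter values are arbitrary, and only the leak current is modelled; a real Hodgkin-Huxley model adds voltage-gated Na and K conductances on top of this.

```python
# Minimal point-neuron sketch: one membrane voltage, leak current only.
# Illustrative values; real Hodgkin-Huxley models add gated conductances.

C_m = 1.0    # membrane capacitance (uF/cm^2)
g_L = 0.3    # leak conductance (mS/cm^2)
E_L = -54.4  # leak reversal potential (mV)
dt  = 0.01   # time step (ms)

def step(V: float, I_ext: float) -> float:
    """One Euler step of C_m dV/dt = -g_L (V - E_L) + I_ext."""
    dVdt = (-g_L * (V - E_L) + I_ext) / C_m
    return V + dt * dVdt

V = -65.0
for _ in range(100000):          # 1000 ms of simulated time
    V = step(V, I_ext=10.0)
# With constant input the voltage settles at E_L + I_ext / g_L
```

The point being made in the post is that everything spatial about the membrane and its surrounding field has been collapsed into the single scalar V here.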

Re: computer pain

2006-12-14 Thread Brent Meeker
Stathis Papaioannou wrote: Brent meeker writes: Stathis Papaioannou wrote: Brent Meeker writes: I would say that many complex mechanical systems react to pain in a way similar to simple animals. For example, aircraft have automatic shut downs and fire extinguishers. They can change

Re: computer pain

2006-12-13 Thread Brent Meeker
James N Rose wrote: Stathis, The reason for lack of responses is that your idea goes directly to illuminating why AI systems - as promulgated under current designs of software running in hardware matrices - CANNOT emulate living systems. It's an issue that AI advocates intuitively and

RE: computer pain

2006-12-13 Thread Stathis Papaioannou
Stathis, The reason for lack of responses is that your idea goes directly to illuminating why AI systems - as promulgated under current designs of software running in hardware matrices - CANNOT emulate living systems. It's an issue that AI advocates intuitively

RE: computer pain

2006-12-13 Thread Stathis Papaioannou
Brent Meeker writes: I would say that many complex mechanical systems react to pain in a way similar to simple animals. For example, aircraft have automatic shut downs and fire extinguishers. They can change the flight controls to reduce stress on structures. Whether they feel this
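The aircraft analogy above can be sketched as a simple threshold reaction, purely as my own illustration (the threshold values and action names are invented, not taken from any avionics system):

```python
# Toy sketch of a mechanical system reacting to 'pain': shed load or
# shut down when a structural stress reading crosses a threshold.
# All values here are invented for illustration.

STRESS_LIMIT = 0.8  # fraction of rated structural load

def react(stress: float) -> str:
    """Return the protective action for a given stress reading."""
    if stress >= 1.0:
        return "emergency shutdown"
    if stress >= STRESS_LIMIT:
        return "reduce control authority"  # ease load on structures
    return "normal operation"

assert react(0.5)  == "normal operation"
assert react(0.85) == "reduce control authority"
assert react(1.2)  == "emergency shutdown"
```

Whether such a reflex arc involves any feeling is, of course, exactly the question the thread is debating.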

Re: computer pain

2006-12-13 Thread James N Rose
Stathis, As I was reading your comments this morning, an example crossed my mind that might fit your description of in-place code lines that monitor 'dysfunction' and exist in-situ as a 'pain' alert... that would be error evaluating 'check-sum' computations. In a functional way, parallel
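The checksum-as-pain idea can be made concrete with a minimal sketch of my own (not from the thread): a module stores a checksum of its own state and signals 'pain' when the state no longer matches it.

```python
# Toy sketch: a checksum monitor as an in-situ 'pain' alert for
# corrupted state, in the spirit of the post's suggestion.

def checksum(data: bytes) -> int:
    """Simple additive checksum over a byte string."""
    return sum(data) % 256

class Module:
    def __init__(self, state: bytes):
        self.state = bytearray(state)
        self.expected = checksum(self.state)

    def in_pain(self) -> bool:
        """'Pain' fires when stored state no longer matches its checksum."""
        return checksum(self.state) != self.expected

m = Module(b"hello")
assert not m.in_pain()
m.state[0] ^= 0xFF        # simulate corruption (a 'noxious stimulus')
assert m.in_pain()
```

As the thread notes, this captures the detection-and-alert function of pain without settling whether anything is experienced.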

Re: computer pain

2006-12-13 Thread Brent Meeker
Stathis Papaioannou wrote: Brent Meeker writes: I would say that many complex mechanical systems react to pain in a way similar to simple animals. For example, aircraft have automatic shut downs and fire extinguishers. They can change the flight controls to reduce stress on

RE: computer pain

2006-12-13 Thread Colin Geoffrey Hales
Hi Stathis/Jamie et al. I've been busy elsewhere in self-preservation mode deleting emails madly... frustrating, with so many threads left hanging... oh well... but I couldn't go past this particular dialog. I am having trouble that you actually believe the below to be the case! Lines of

RE: computer pain

2006-12-12 Thread Stathis Papaioannou
No responses yet to this question. It seems to me a straightforward consequence of computationalism that we should be able to write a program which, when run, will experience pain, and I suspect that this would be a substantially simpler program than one demonstrating general intelligence. It

Re: computer pain

2006-12-12 Thread James N Rose
Stathis, The reason for lack of responses is that your idea goes directly to illuminating why AI systems - as promulgated under current designs of software running in hardware matrices - CANNOT emulate living systems. It's an issue that AI advocates intuitively and scrupulously AVOID. Pain in
