Brent Meeker writes:
> Brent Meeker writes:
>> >> Pain is limited on both ends: on the input by damage to the physical
>> >> circuitry and on the response by the possible range of
>> > Responses in the brain are limited by several mechanisms, such as
>> > exhaustion of neurotransmitter stores at synapses, negative feedback
>> > mechanisms such as downregulation of receptors, and, I suppose, the
>> > total numbers of neurons that can be stimulated. That would not be a
>> > problem in a simulation, if you were not concerned with modelling
>> > the behaviour of a real brain. Just as you could build a structure
>> > 100km tall as easily as one 100m tall by altering a few parameters
>> > in an engineering program, so it should be possible to create
>> > unimaginable pain or pleasure in a conscious AI program by changing
>> > a few parameters.
>> I don't think so. It's one thing to identify functional equivalents
>> as 'pain' and 'pleasure'; it's something else to claim they have the
>> same scaling. I can't think of any way to establish an invariant
>> scaling that would apply equally to biological, evolved creatures and
>> to robots.
> Take a robot with pain receptors. The receptors take temperature and
> convert it to a voltage or current, which then goes to an analogue to
> digital converter, which inputs a binary number into the robot's central
> computer, which then experiences pleasant warmth or terrible burning
> depending on what that number is. Now, any temperature transducer is
> going to saturate at some point, limiting the maximal amount of pain,
> but what if you bypass the transducer and the A/D converter and input the
> pain data directly into the computer? Sure, there may be software limits
> specifying an upper bound to the pain input (e.g., if x > 100 then input
> 100), but what theoretical impediment would there be to changing this?
> You would have to show that pain or pleasure beyond a certain limit is
No. I speculated that pain and pleasure are functionally defined. So there could be a functionally defined limit. Just because you can put in a bigger representation of a number, it doesn't follow that the functional equivalent of pain is linear in this number and doesn't saturate.
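A toy sketch of the two models being contrasted here, assuming nothing about how a real robot would be built (the function names and numbers are purely illustrative): a hard software clamp of the "if x > 100 then input 100" kind, versus a functionally defined response that saturates on its own, so that feeding in a bigger number does not produce proportionally more "pain".

```python
import math

def clamped_pain(x, limit=100):
    """Software limit of the kind described above: anything over the
    limit is simply cut off. Removing the clamp lets the value grow
    without bound."""
    return min(x, limit)

def saturating_pain(x, scale=50.0):
    """A hypothetical functionally defined response: a logistic curve
    that approaches 1.0 however large x gets, so the limit is built
    into the function itself rather than imposed as a cutoff."""
    return 1.0 / (1.0 + math.exp(-x / scale))

# Under the saturating model, a much bigger input number buys almost
# no additional response.
for x in (10, 100, 1000, 10**6):
    print(x, clamped_pain(x), saturating_pain(x))
```

The point of the second function is that the saturation is part of the functional definition: there is no single parameter you can change to make the response unbounded, which is one way of reading the objection above.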
Pain and pleasure have a function in naturally evolved entities, but I am not
sure if you mean something beyond this by "functionally defined". Digging a
hole involves physically moving quantities of dirt, and a simulation of the
processes taking place in the processor of a hole-digging robot will not actually
move any dirt. However, if the robot is conscious (and a sufficiently sophisticated
hole-digging robot may be) then the simulation should reproduce, from its point
of view, the experience. Moreover, with a little tweaking it should be possible to
give it the experience of digging a hole all the way to the centre of the Earth, even
though in reality it would be impossible to do such a thing. I don't think it would be
reasonable to say that the virtual experience would be limited by the physical reality.
Even if there is something about the robot's hardware which prevents it from experiencing
the digging of holes beyond a certain depth (because there was never any need for it), surely
it would just be a minor technical problem to remove such a limit.
You could speculate that the experience of digging holes involves the dirt, the shovel, the robot's
sensors and effectors, and the power supply as well as the central processor, which would mean
that generating the virtual experience by manipulating just the central processor is impossible.
This is perhaps what Colin Hales has been arguing, and it is contrary to computationalism.
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to firstname.lastname@example.org
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at