Hi Stathis,

> This seems to be getting away from the simple requirement that the
> computer be able to handle counterfactuals. What if the device were
> not easy to disarm, but almost impossible to disarm? What if it had
> tentacles in every neurone, ready to destroy it if it fired at the
> wrong time?

I do think you have a point there. I began by equating counterfactual
structure with cause/effect structure, but now I am drifting away from
that... So, can I make the point purely in terms of causality?

I still think the answer may be yes...

The causal structure of a recording still looks very different from the
causal structure of a person who happens to follow a recording and
also happens to be wired to a machine that will kill them if they
deviate. Or, even, correct them if they deviate. (Let's go with the
correction version, so that I can't fall back on the simplistic
difference that "a recording will not die if some external force
causes it to deviate".)

1. Realistic malfunctions of a machine playing a recording are very
different from realistic malfunctions of the person-machine combo. The
combo inherits the possible malfunctions of the machine, *plus*
malfunctions in which the machine fails to modify the person's
behavior to match the recording. (A malfunction can be defined in
terms of cause-effect counterfactuals in two ways: first, if we think
that cause/effect is somewhat probabilistic, then any machine will
occasionally malfunction; second, varying external factors can cause
malfunctions.)

2. Even during normal functioning, the cause/effect structure is very
different: the person-combo has a lot of extra structure, since it
contains a functioning brain and a corrective mechanism, neither of
which is needed for the recording.
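Here is a toy sketch of what I mean (purely my own illustration -- the
names and the five-step output trace are made up, not anything from our
discussion). The recording and the person-machine combo produce the same
surface behavior, but perturbing the person's internal state makes the
corrector fire, which is extra causal structure the bare recording
simply does not have:

```python
# Toy model: a bare recording vs. a "person + corrector" combo.
RECORDING = [0, 1, 1, 0, 1]  # the fixed output trace

def play_recording(step):
    """A recording: output depends only on the stored trace."""
    return RECORDING[step]

def person(step, state):
    """Stand-in for the person's own computation: output depends on
    internal state, so perturbing the state changes the raw output."""
    return (state + step) % 2

def person_with_corrector(step, state):
    """The combo: the person computes, then the corrector forces the
    output back to the recording whenever it deviates."""
    raw = person(step, state)
    corrected = RECORDING[step]           # corrector overrides deviations
    return corrected, raw != corrected    # (output, did a correction fire?)

# Surface behavior is identical to the recording...
combo_outputs = [person_with_corrector(t, state=1)[0] for t in range(5)]

# ...but counterfactually the combo has extra causal events: with the
# perturbed internal state, corrections fire at some steps, whereas the
# recording has no internal state to perturb at all.
corrections = [person_with_corrector(t, state=1)[1] for t in range(5)]
print(combo_outputs)  # [0, 1, 1, 0, 1]
print(corrections)    # [True, True, False, False, False]
```

The point of the toy is just that the two systems agree on every actual
output while differing in which events *would* occur under perturbation.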

Also-- the level at which the correction is applied matters quite a
bit, I think. If only muscle actions are being corrected, the person
seems obviously conscious-- lots of computation (& the corresponding
causal structure) is still going on. If each neuron is corrected, this
is not so intuitively obvious. (I suppose my intuition says that the
person would lose consciousness when the first correction occurred,
though that is silly upon reflection.)

How does that sound?

--Abram

On Fri, Dec 5, 2008 at 7:58 PM, Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
>
> 2008/12/6 Abram Demski <[EMAIL PROTECTED]>:
>>
>> Stathis,
>>
>> I think I can get around your objection by pointing out that the
>> structure of counterfactuals is quite different for a recording vs. a
>> full human who is wired to be killed if they deviate from a recording.
>> Someone could fairly easily disarm the killing device, whereas it
>> would be quite difficult to reconstruct the person from the recording
>> (in fact there is not enough information to do so).
>
> This seems to be getting away from the simple requirement that the
> computer be able to handle counterfactuals. What if the device were
> not easy to disarm, but almost impossible to disarm? What if it had
> tentacles in every neurone, ready to destroy it if it fired at the
> wrong time?
>
>> A related way out would be to point out that all the computational
>> machinery is present in one case (merely disabled), whereas it is
>> totally absent in the other case.
>
> So you agree that in the case where the extra machinery is waiting to
> be dropped into place, consciousness results?
>
>
> --
> Stathis Papaioannou
>
>

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to [EMAIL PROTECTED]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en
-~----------~----~----~----~------~----~------~--~---