2009/4/23 Brent Meeker <meeke...@dslextreme.com>:
>> Say a machine is in two separate parts M1 and M2, and the information
>> on M1 in state A is written to a punchcard, walked over to M2, loaded,
>> and M2 goes into state B. Then what you are suggesting is that this
>> sequence could give rise to a few moments of consciousness, since A
>> and B are causally connected; whereas if M1 and M2 simply went into
>> the same respective states A and B at random, this would not give rise
>> to the same consciousness, since the states would not have the right
>> causal connection. Right?
> Maybe. But I'm questioning more than the lack of causal connection.
> I'm questioning the idea that a static thing like a state can be
> conscious. That consciousness goes through a set of states, each one
> being an "instant", is an inference we make in analogy with how we would
> write a program simulating a mind. I'm saying I suspect something
> essential is missing when we "digitize" it in this way. Note that this
> does not mean I'd say "No" to Bruno's doctor - because the doctor is
> proposing to replace part of my brain with a mechanism that instantiates
> a process - not just discrete states.
What is needed for the series of states to qualify as a process? I
assume that a causal connection between the states, as in my example
above, would be enough, since it is what happens in normal brains and
computers. But what would you say about the examples I give below,
where the causal connection is disrupted in various ways: is there a
process or is there just an unfeeling sequence of states?
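For concreteness, the causal-transfer scenario and its coincidental counterpart can be sketched in code (a toy machine and transition rule; all names are mine, purely for illustration):

```python
# Illustrative sketch (hypothetical): M1's state A is serialized to a
# "punchcard", carried over, and loaded into M2 -- versus M2 happening
# to be in an identical state with no causal link to M1.
import random

def step(state):
    """Toy transition rule: computes the 'next state' B from a state A."""
    return [bit ^ 1 for bit in state]  # e.g. flip every bit

def causal_transfer(m1_state):
    punchcard = list(m1_state)   # write M1's state A to the card
    m2_state = list(punchcard)   # walk the card over and load it into M2
    return step(m2_state)        # M2 goes into state B

def coincidental(length, rng):
    # M2 starts in a randomly chosen state; by luck it may match A exactly.
    m2_state = [rng.randint(0, 1) for _ in range(length)]
    return step(m2_state)

A = [1, 0, 1, 1]
B = causal_transfer(A)                    # B is causally connected to A
B_lucky = coincidental(len(A), random.Random(0))  # same rule, no causal link
```

The two runs can pass through identical state sequences; the only difference is the history connecting them, which is exactly the distinction at issue.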
>> But then you could come up with variations on this experiment where
>> the transfer of information doesn't happen in as straightforward a
>> manner. For example, what if the operator who walks over the punchcard
>> gets it mixed up in a filing cabinet full of all the possible
>> punchcard variations, and either (a) loads one of the cards into M2
>> because he gets a special vibe about it and it happens to be the right
>> one, or (b) loads all of the punchcards into M2 in turn so as to be
>> sure that the right one is among them? Would the machine be conscious
>> if the operator loads the right card knowingly, but not if he is just
>> lucky, and not if he is ignorant but systematic? If so, how could the
>> computation know about the psychological state of the operator?
You received this message because you are subscribed to the Google Groups
"Everything List" group.