On 7/1/2012 10:26 AM, John Clark wrote:
On Sun, Jul 1, 2012 Bruno Marchal <marc...@ulb.ac.be> wrote:
>> They are incompatible from the "1-pov" ONLY if you assume there can be
>> only one Bruno Marchal.
> "1-pov" means "1-pov" from the 1-pov view.
That's real nice, but the predictions written down in advance were:
1) I Bruno Marchal will write in my diary "I Bruno Marchal am now in Washington and only Washington."
2) I Bruno Marchal will write in my diary "I Bruno Marchal am now in Moscow and only Moscow."
Without making silly assumptions like there can be only one Bruno Marchal show me how
these predictions were wrong from ANY perspective you care to name.
The difficulty seems to be that we tend to think of ourselves as unique and so this
produces contradictions with the idea we can be duplicated. Suppose we take a coin and on
the heads side we write "This side will come up." Then we flip the coin and half the time
the writing is correct and half the time it's wrong. Now a man comes along and says let
me explain this. He duplicates the coin; and now when you flip one of the coins he
catches it and puts it down one way and the other one beside it the opposite way (but you
can't tell which is which). And he says, see this is what is really happening and why the
probability is 1/2 - from the viewpoint of the original coin.
It might be objected that brains are different because they have a first-person
viewpoint. But why do they have it? Suppose that before the duplication of the man in
Helsinki, his brain is fitted with a transceiver that can broadcast and receive its
signals. Now when he is duplicated, his brain in Washington is still connected to
his brain in Moscow, and he is in two places at once in terms of his perceptions. Does he
still have unity of consciousness? I think that he would if the connections were
sufficiently comprehensive. He might have one focus of conscious attention that could
shift between what he perceived in Moscow and what he perceived in Washington. Could he
have two? Can you notice what your toes are feeling at the same time you notice what your
fingers are feeling? I can't. So the 'unity of consciousness' seems to be an inability to
process multiple sources of input at the same time and fit them into your inner
narrative. Your inner narrative assumes a single 'you'. In the connected brain example
it might be possible to learn a bi-local point of view. Certainly in terms of artificial
intelligence there would be no need to assume a single localized 'self'. Already military
defensive systems integrate many sensors from many locations.
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to email@example.com.