On Thu, Aug 1, 2019 at 6:34 PM 'Brent Meeker' via Everything List <
[email protected]> wrote:

>
>
> On 8/1/2019 9:41 AM, Jason Resch wrote:
>
>
>
> On Tue, Jul 30, 2019 at 1:08 AM 'Brent Meeker' via Everything List <
> [email protected]> wrote:
>
>>
>>
>> On 7/29/2019 2:14 PM, Jason Resch wrote:
>>
>> That is not a well-defined procedure -- too many ambiguities remain.
>>>
>>>
>>
>> What is ambiguous? One physical state "your brain" is adjusted gradually
>> until it is equal to the brain of someone else.  You agree this is
>> physically possible,  right?
>>
>>
>> No, it's not possible if the conscious processes of the brain are
>> digital.  Then there is a lower bound to "gradualness".
>>
>
> If it is digital then wouldn't it already be "discontinuous" by nature?
> Are you suggesting personal identity can or should be defined by mapping an
> identity to a particular digital program?
>
>
>> And what if the two brains have different numbers of neurons and
>> different numbers of connections...how can you map one to the other?
>>
>
> No problem. New neurons grow and die all the time in our own brains.
>
>
> By that argument there's no reason to exchange A's brain with B's brain.
> They'll just grow new neurons anyway (although it's not clear that mature
> brains grow very many new neurons).
>
> My point is that if A's brain has five neurons and B's brain has seven
> neurons, then you can't "gradually" transfer A's connectivity to B's
> brain.  There are going to be unconnected neurons at some point.
>

Missing neurons can be virtually represented as extant neurons which have
no connections.  A net with 5 neurons can be considered a net of 7 neurons
where 2 of the neurons are not connected to anything else.  Likewise, an
extra neuron can be removed by decreasing the weights of its connections
until they are 0 for everything it is wired to.

In this way, any neural net can be morphed into any other simply by
adjusting the connection weight between one pair of neurons at a time.  Is
any one of these adjustments alone enough to destroy conscious brain
activity?  Perhaps it is, and you go into a coma for a few months; but the
problem for bodily-continuity theories of personal identity is that you
have had one body and brain the whole time, and yet you are now identical
with someone quite different.
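
As a minimal sketch of the idea (hypothetical representation: nets as
square weight matrices, with the smaller net padded out by zero-weight
"virtual" neurons, and each connection interpolated individually toward
the target):

```python
import numpy as np

def pad_to(w, n):
    """Embed a smaller net in an n-neuron net: the missing neurons
    appear as extant neurons with all-zero connections."""
    padded = np.zeros((n, n))
    k = w.shape[0]
    padded[:k, :k] = w
    return padded

def morph_steps(w_a, w_b, steps_per_weight=10):
    """Yield intermediate weight matrices, gradually changing one
    connection at a time from w_a's value to w_b's value."""
    n = max(w_a.shape[0], w_b.shape[0])
    current = pad_to(w_a, n)
    target = pad_to(w_b, n)
    for i in range(n):
        for j in range(n):
            start = current[i, j]
            for s in range(1, steps_per_weight + 1):
                # move this single connection a small step toward the target
                current[i, j] = start + (target[i, j] - start) * s / steps_per_weight
                yield current.copy()

# a 5-neuron net morphed into a 7-neuron net
a = np.ones((5, 5))
b = np.full((7, 7), 2.0)
states = list(morph_steps(a, b))
```

After the final step, the padded 5-neuron net's weight matrix is exactly
the 7-neuron target's, with no discontinuous jump at any single step.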

It is as Saibal Mitra said: the person he was when he was 3 is dead.  Too
much information has been added to his brain since.  If his 3-year-old self
were suddenly replaced with his much older self, you would conclude the
3-year-old was destroyed; but when the same changes are made gradually, day
by day, common sense and convention maintain that the 3-year-old was not
destroyed and still lives.  This is the inconsistency of continuity
theories.

You get different answers for physically identical situations, depending on
the history, or path, of getting there.  Nowhere in physics does the
history of a particle affect how it behaves.  Yet this is what continuity
theories of personal identity demand: in one situation concluding a person
is dead, while in another concluding they are alive, depending on the
history of the particles constituting that person.

Jason
