On Sep 17, 2012, at 3:39 PM, Craig Weinberg <whatsons...@gmail.com> wrote:



On Monday, September 17, 2012 9:24:23 AM UTC-4, stathisp wrote:


On Sep 16, 2012, at 10:42 PM, Craig Weinberg <whats...@gmail.com> wrote:

Moreover, this
set has subsets, and we can limit our discussion to these subsets. For
example, if we are interested only in mass, we can simulate a human
perfectly using the right number of rocks. Even someone who believes
in an immortal soul would agree with this.

No, I don't agree with it at all. You are eating the menu. A quantity of mass doesn't simulate anything except in your mind. Mass is a normative abstraction which we apply in comparing physical bodies with each other. To reduce a human being to a physical body is not a simulation; it is only weighing a bag of organic molecules.

I'm just saying that the mass of the human and the mass of the rocks is the same, not that the rocks and the human are the same. They share a property, which manifests as identical behaviour when they are put on scales. What's controversial about that?
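
A toy sketch of the point being made here (the names Human, RockPile and scale_reading are mine, purely illustrative): two very different systems can share a single measurable property, and an instrument that interacts with them only through that property cannot distinguish them.

# Illustrative sketch only: a scale "sees" a body solely through its mass,
# so a human and a pile of rocks of equal mass produce identical readings.

class Human:
    def __init__(self, mass_kg: float):
        self.mass_kg = mass_kg

class RockPile:
    def __init__(self, mass_kg: float):
        self.mass_kg = mass_kg

def scale_reading(body) -> float:
    # The measurement channel is restricted to one property: mass.
    return body.mass_kg

print(scale_reading(Human(70.0)))     # 70.0
print(scale_reading(RockPile(70.0)))  # 70.0, indistinguishable to the scale

Nothing in the sketch says the two objects are the same; it only says that, relative to this one measurement, they behave identically, which is all the claim above asserts.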

It isn't controversial, but I am suggesting that maybe it should be. It isn't that there is an independent and disembodied 'property' that the human body and the rocks share; it is that we measure them in a way which allows us to categorize the behavior of one as similar to the other in a particular way.

Think of the fabric of the universe being like an optical illusion where colors change when they are adjacent to each other but not when they are against grey. There is no abstract property being manifested as concrete experiences; only concrete experiences can be re-presented as abstract properties.


Yes, but there are properties of the brain that may not be relevant to
behaviour. Which properties are in fact important is determined by
experiment. For example, we may replace the myelin sheath with a
synthetic material that has similar electrical properties and then
test an isolated nerve to see if action potentials propagate in the
same way. If they do, then the next step is to incorporate the nerve
in a network and see if the pattern of firing in the network looks
normal. The step after that is to replace the myelin in the brain of a
rat to see if the animal's behaviour changes. The modified rats are
compared to unmodified rats by a blinded researcher to see if he can
tell the difference. If no-one can consistently tell the difference
then it is announced that the synthetic myelin appears to be a
functionally identical substitute for natural myelin.

Except it isn't identical. No imitation substance is identical to the original. Sooner or later the limits of the imitation will be found - or they could be advantages. Maybe the imitation myelin prevents brain cancer or heat stroke, but maybe it also prevents sensation in cold weather, or maybe certain amino acids now cause Parkinson's disease. There is no such thing as identical. There is only 'seems identical from this measure at this time'.

Yes, it's not *identical*. No-one has claimed this. And since it's not identical, under some possible test it would behave differently; otherwise it would be identical.

Not in the case of consciousness. There is no reason to believe that it is possible to test the quality of consciousness. What might seem identical in a child may turn out to be completely dysfunctional in adolescence - or it might be that tests done in a laboratory fail to reveal real-world defects. We have no reason to believe that it is possible for consciousness to be anything other than completely unique, perhaps even tied to the place and time of its instantiation.


But there are some changes which make no functional difference.

Absolutely, but consciousness is not necessarily a function, and function is subject to the form of measurement and interpretation applied.

If I have a drink of water, that changes my brain by decreasing the sodium concentration. But this change is not significant if we are considering whether I continue to manifest normal human behaviour, since firstly the brain is tolerant of moderate physical changes

But a few milligrams of LSD or ricin (LD100 of 25 µg/kg) will have a catastrophic effect on normal human capacities, so that the brain's tolerance has nothing to do with how moderate the physical changes are. That's a blanket generalization that doesn't pan out. It's folk neuroscience.
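
(For scale, taking the quoted LD100 figure at face value and assuming a 70 kg adult, an illustrative assumption on my part: 25 µg/kg x 70 kg = 1,750 µg ≈ 1.75 mg, i.e. a dose on the order of a couple of milligrams.)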

and secondly people can manifest a range of different behaviours and remain recognisably human and recognisably the same human. In other words humans have certain engineering tolerances in their components, and the aim in replacing components would be to do it within this tolerance. Perfection is not attainable by either engineers or nature.

Engineering may not be applicable to consciousness though. There is tolerance for the extension of consciousness - if you injure your spine, we could engineer a new segment, just like we could replace your leg with a prosthetic, but there is not necessarily a replacement for the self as a whole. A prosthetic head that doesn't replace the person is not necessarily a possibility. You assume that a person is a structure with interchangeable parts. I think it is an experience which is inherently irreducible and non-transferable.



As is the nature
of science, another team of researchers may then find some deficit in
the behaviour of the modified rats under conditions the first team did
not examine. Scientists then make modifications to the formula of the
synthetic myelin and do the experiments again.

Which is great for medicine (although ultimately maybe unsustainably expensive), but it has nothing to do with the assumption of identical structure and the hard problem of consciousness. There is no such thing as identical experience. I have suggested that in fact we can perhaps define consciousness as that which has never been repeated. It is the antithesis of that which can be repeated (hence the experience of "now"), even though experiences themselves can seem very repetitive. They only seem so from the vantage point of a completely novel moment of consideration of the memories of previous iterations.

Here is where you have misunderstood the whole aim of the thought experiment in the paper you have cited. The paper assumes that identical function does *not* necessarily result in identical consciousness and follows this idea to see where it leads.

I understand that, but it still assumes that there is such a thing as a set of functions which could be identified and reproduced that cause consciousness. I don't assume that, because consciousness isn't like anything else. It is the source of all functions and appearances, not the effect of them. Once you have consciousness in the universe, then it can be enhanced and altered in infinite ways, but none of them can replace the experience that is your own.


Craig,

Do you think if your brain were cut in half, but then perfectly put back together that you would still be conscious in the same way?

What if cut into a thousand pieces and put back together perfectly?

What if every atom was taken apart and put back together?

What if every atom was taken apart, and then atoms from a different pile were used to put you back together?

What then if the original atoms were put back, would they both experience what it is like to be you?

Does the identity of one's atoms matter or are they interchangeable? If the identity is not what matters, what is it that does?

Jason


> This is the point of the thought experiment. The limitations of all forms of measurement and perception preclude all possibility of there ever being such a thing as an exhaustively complete set of third-person behaviors of any system.
>
> What is it that you don't think I understand?

What you don't understand is that an exhaustively complete set of
behaviours is not required.

Yes, it is. Not for prosthetic enhancements or repairs to a nervous system, but to replace a nervous system without replacing the person who is using it, there is no set of behaviors which can ever be exhaustive enough, even in theory, to accomplish that. You might be able to do it biologically, but there is no reason to trust it unless and until someone can be walked off of their brain for a few weeks or months and then walked back on.

The replacement components need only be within the engineering tolerance of the nervous system components. This is a difficult task but it is achievable in principle.

You assume that consciousness can be replaced, but I understand exactly why it can't. You can believe that there is no difference between scooping out your brain stem and replacing it with a functional equivalent as long as it was well engineered, but to me it's a completely misguided notion. Consciousness doesn't exist on the outside of us. Engineering only deals with exteriors. If the universe were designed by engineers, there could be no consciousness.


I don't access an exhaustively complete
set of behaviours to determine if my friends are the same people from
day to day, and in fact they are *not* the same systems from day to
day, as they change both physically and psychologically. I have in
mind a rather vague set of behavioural limits and if the
people who I think are my friends deviate significantly from these
limits I will start to worry.

Which is exactly why you would not want to replace your friends with devices capable only of programmed deviations. Are simulated friends 'good enough'? Will it be good enough when your friends convince you to be replaced by your simulation?

I assume that my friends have not been replaced by robots. If they have been then that means the robots can almost perfectly replicate their behaviour, since I (and people in general) am very good at picking up even tiny deviations from normal behaviour. The question then is, if the function of a human can be replicated this closely by a machine, does that mean the consciousness can also be replicated? The answer is yes, since otherwise we would have the possibility of a person having radically different experiences but behaving normally and being unaware that their experiences were different.

The answer is no. A cartoon of Bugs Bunny has no experiences but behaves just like Bugs Bunny would if he had experiences. You are eating the menu.

Craig



-- Stathis Papaioannou

