On Tuesday, October 8, 2013 10:10:25 AM UTC-4, Jason wrote:
> On Sun, Oct 6, 2013 at 3:00 PM, Craig Weinberg <whats...@gmail.com> wrote:
>> On Sunday, October 6, 2013 5:06:31 AM UTC-4, Bruno Marchal wrote:
>>> On 06 Oct 2013, at 03:17, Stathis Papaioannou wrote: 
>>> > On 5 October 2013 00:40, Bruno Marchal <mar...@ulb.ac.be> wrote: 
>>> > 
>>> >>> The argument is simply summarised thus: it is impossible even for 
>>> >>> God to make a brain prosthesis that reproduces the I/O behaviour 
>>> >>> but has different qualia. This is a proof of comp, 
>>> >> 
>>> >> 
>>> >> Hmm... I can agree, but eventually no God can make such a 
>>> >> prosthesis, only because the qualia are an attribute of the 
>>> >> "immaterial person", and not of the brain, body, or computer. Then 
>>> >> the prosthesis will manifest the person if it emulates the correct 
>>> >> level. 
>>> > 
>>> > But if the qualia are attributed to the substance of the physical 
>>> > brain, then where is the problem in making a prosthesis that 
>>> > replicates the behaviour but not the qualia? The problem is that it 
>>> > would allow one to make a partial zombie, which I think is absurd. 
>>> > Therefore, the qualia cannot be attributed to the substance of the 
>>> > physical brain. 
>>> I agree. 
>>> Note that in that case the qualia are no longer attributed to an 
>>> immaterial person, but to a piece of primary matter. In that case, 
>>> both comp and functionalism (in your sense, not in Putnam's usual 
>>> sense of functionalism, which is a particular case of comp) are wrong. 
>>> Then it is almost obvious that an immaterial being cannot distinguish 
>>> between a primarily material incarnation and an immaterial one, as it 
>>> would need some magic (non-Turing-emulable) ability to make the 
>>> difference. People who agree with this no longer need the UDA step 8 
>>> (which is an attempt to make this more rigorous and clear). 
>>> I might criticize a little, as a devil's advocate, the partial-zombie 
>>> argument. Very often people claim that they feel less conscious after 
>>> some vodka, but that they are still able to behave normally. Of course 
>>> those people are notoriously wrong. It is just that alcohol produces a 
>>> fake feeling of self-confidence, which typically is not verified (in 
>>> the laboratory, or more sadly on the roads). Also, they confuse "less 
>>> conscious" with "blurred consciousness",
>> Why wouldn't less consciousness have the effect of seeming blurred? If 
>> the battery in a device is dying, the device might begin to fail in 
>> numerous ways, but those are all symptoms of the dying battery - of the 
>> device becoming less reliable as different parts become unavailable at 
>> different times.
>> Think of qualia as a character in a long story which is divided into 
>> episodes. If, for instance, someone starts watching a show like Breaking 
>> Bad only in the last season, they have no explicit understanding of who 
>> Walter White is, why he behaves the way he does, where Jesse came from, 
>> etc. They can only pick up what is presented directly in that episode, so 
>> his character is relatively flat. The difference between the appreciation 
>> of the last episode by someone who has seen the entire series on HDTV and 
>> someone who has only read the closed captioning of the last episode on 
>> Twitter is like the difference between a human being's qualia and the 
>> qualia which is available through a logical imitation of a human being. 
>> Qualia is experience which contains the felt relation to all other 
>> experiences: specific experiences which directly relate, and extended 
>> experiential contexts which extend to eternity (the totality of manifested 
>> events so far relative to the participant, plus semi-potential events which 
>> relate to higher octaves of their participation... the bigger picture with 
>> the larger now). 
> Craig,
> I agree with you that there is some "building up" required to create a 
> full and rich human experience, which cannot happen in a single instant or 
> with a single CPU instruction being executed. However, where I disagree 
> with you is in how long it takes for all the particulars of the experience 
> to be generated from the computation. I don't think it requires 
> re-calculating the entire history of the human race, or of life itself on 
> some planet. I think it can be done by comparing relations to memories and 
> data stored entirely within the brain itself; say within 0.1 to 0.5 
> seconds of computation by the brain, not the eons of life's evolution.

That could be true in theory, but it does not seem to be supported by 
nature. In reality, there is no way to watch a movie in less time than its 
running time without sacrificing some qualities of the experience. 
Experience is nothing like data, as data is compressible precisely because 
it has no qualitative content to be lost in translation. In all cases, 
calculation is used to eliminate experience - to automate and anesthetize. 
It cannot imitate experience, any more than an HSV coordinate can look like 
a particular color. 

We cannot separate the experience of a living person from their history 
going back to the beginning of time any more than we can take out the last 
chapter of War and Peace and glue it to the end of a manual for a digital 
thermostat. It might look like a book to us, but the story doesn't make 
sense. We can't sum up the meaning of the absent chapters without having 
ourselves lived in a world that Tolstoy once lived in. That world is not 
accessible to digits or thermostats.

> So perhaps we are on the same page, but merely disagree on how detailed 
> the details of the computation need to be; i.e., what is the substitution 
> layer (atomic interactions, or the history of atomic interactions on a 
> global scale?).

That would make sense if computation were the same as experience, rather 
than a representation of its opposite. In reality, though, I think that 
substitution is an illusion of computation itself. There is no such thing 
as emulation or approximation. Awareness is absolutely non-emulable because 
it is the manifestation of uniqueness itself.

You received this message because you are subscribed to the Google Groups 
"Everything List" group.