On 13 May 2015 at 09:16, meekerdb <[email protected]> wrote:
> On 5/12/2015 12:34 PM, Stathis Papaioannou wrote:
>
> On 12 May 2015, at 10:37 am, Russell Standish <[email protected]> wrote:
>
> On Tue, May 12, 2015 at 10:23:31AM +1000, Stathis Papaioannou wrote:
>
> The final straw would have to be indivisible, otherwise you could make a
> partial zombie by replacing half the straw.
>
> I disagree. The final straw either works, or does not work. If you
> replace half the straw, then the resulting half-straw either works or
> it doesn't, and that directly affects whether you have a conscious
> entity or not. No need to invoke a partial zombie.
>
> It would lead to a strange form of computationalism: you could replace say
> 40% of the brain without any problem, but go to 40.00000001% and
> consciousness goes off.
>
> It's what happens in the real world all the time. One moment you have
> a whole working network - the next you have pieces. Consider
> dismantling an engine. 3 screws out, and the engine still idles. Take
> the 4th out, and the head falls off.
>
> It could work that way with consciousness, and going from full consciousness
> to full zombie would be a way to avoid the absurdity of partial zombies. But
> it would have the following consequences:
>
> Physiologically, qualia do in fact fade in parallel with function as neurons
> are destroyed, but if the neurons are replaced, this relationship between
> function and qualia is overturned. The artificial neurons can sustain the
> consciousness that would otherwise have been lost, so computationalism is
> true to this extent, but they lose this capability at a certain threshold.
> Two beings with partial brain replacements could differ only in the smallest
> possible increment, such as the position of an electron, but one is a zombie
> and the other fully conscious.
>
> I don't understand your reference to fading qualia. Chalmers argued that
> there could not be fading qualia in a transition from a biological brain to
> a functionally identical silicon chip brain. That doesn't rule out qualia
> fading as neurons or transistors are simply lost from the system. In that
> case I think it quite likely that qualia will "fade". And "fading" may mean
> loss of some aspect of qualia. For example, we could lose colour vision if we
> lost the cone receptors in our retinas.
I agree with your comment. My post was in response to Russell, who claimed
that a way out of the conclusion of the "Fading Qualia" paper (that
computerised replacement of neurons would preserve consciousness) was that
the qualia could remain until some threshold of replacement was reached,
then suddenly disappear, causing a transition from full consciousness to
full zombie. Bruno also suggested something like this. It is possible, but
it would mean there is a decoupling between qualia and the underlying
function, and it would also make for a weird kind of partial
computationalism.

--
Stathis Papaioannou

