On Oct 5, 7:10 pm, Stathis Papaioannou <stath...@gmail.com> wrote:

> On Thu, Oct 6, 2011 at 2:44 AM, Craig Weinberg <whatsons...@gmail.com> wrote:
> > On Oct 5, 10:15 am, Quentin Anciaux <allco...@gmail.com> wrote:
> >
> >> No they are not saying that. They are saying that a model of the brain fed with the same inputs as a real brain will act as the real brain... if it was not the case, the model would be wrong so you could not label it as a model of the brain.
> >
> > That would require that the model of the brain be closer than genetically identical, since identical twins and conjoined twins do not always respond the same way to the same inputs. That may not be possible, since the epigenetic variation and developmental influences may not be knowable or reproducible. It's a 'Boys From Brazil' theory. Cool sci-fi, but I don't think we will ever have to worry about considering it as a real possibility. We know nothing about what the substitution level of the 'same inputs' would be either. Can you say that making a brain of a 10 year old would not require 10 years of sequential neural imprinting, or that the imprinting would be any less complex to develop than it would be to create the world itself?
>
> Firstly, it is theoretically possible to model the brain arbitrarily closely, even if technically difficult. Secondly, it is enough for the purposes of the discussion to model a generic brain, not a particular brain.
I disagree on both counts. There is no such thing as a generic brain, and any theory which assumes that it's possible to model the brain closely enough to replace it does not understand the relation between the brain and mind. I think that I do understand that relation, and my understanding suggests that every brain is unique to the point that it may not even be possible to reproduce a single moment of a brain's function, let alone an ongoing mechanism. It's not clear that the brain could even be considered the same thing as itself from day to day. It's more like an electronic cloud of meaty snot that is continuously changing in novel, often utterly idiosyncratic ways.

> >
> >> They never said they could know which inputs you could have, and they don't have to. They just have to know the transition rule (biochemical/physical) of each neuron, and as the brain respects physics, so does the model, and so it will react the same way.
> >
> > Reacting is not experiencing though. A picture of a brain can react like a brain, but it doesn't mean there is an experiential correlate there. Just because the picture is 3D and has some computation behind it instead of just a recording, why would that make it suddenly have an experience?
>
> In the first instance, yes, you might not be sure if the artificial brain is a zombie. But the fading qualia thought experiment shows that if it is a zombie it would allow you to make absurd creatures, partial zombies (defined as someone who lacks a particular conscious modality but behaves normally and doesn't realise anything is wrong).

Fading qualia is not a problem. Somnambulism, conversion disorders, and synesthesia exist already. Blind people store tactile qualia in their visual cortex. Are blind people absurd creatures because they see through their fingers?

> The only way to avoid the partial zombies is if the brain model replicates consciousness along with function.
That statement is much more absurd than the idea of partial zombies. It is to say that the only way to avoid computers without screens is if all computation replicates a monitor with its function. It's ridiculous and false if you ask me. Unscientific. Lazy.

> >
> >> You make the same mistake with your tv pixel analogy. If I know all the transition rules of *a pixel* according to input... I can build a model of a TV that will *exactly* display the same thing as the real TV for the same inputs without knowing anything about movies/shows/whatever... I don't care about movies at that level. They never said that they would explain/predict the input to the tv, just replicate the tv.
> >
> > You have to care about the movies at that level because that's what consciousness is in the metaphor. If you don't have an experience of watching a movie, then you just have an a-signifying non-pattern of unrelated pixels. You need a perceiver, an audience, to turn the image into something that makes sense. It's like saying that you could write a piece of software that could be used as a replacement for a monitor. It doesn't matter if you have a video card in the computer and drivers to run it; without the actual hardware screen plugged into it there is no way for us to see it. A computer does not come with its own screen built into the interior of its microprocessors - but we do have the equivalent of that. Our experience cannot be seen from our neurology; you have to already know it's there. Building a model based only on neurology doesn't mean that experience comes with it any more than a video driver means you don't need a monitor.
>
> A model of the TV will reproduce the externally observable behaviour of a TV, given the same inputs. That's what a model is. A model of a brain would reproduce the externally observable behaviour of a brain.
> Whether it would also reproduce the consciousness is a further question, and the fading qualia thought experiment shows that it would.

A model of the TV will only reproduce the external behavior of a TV if we already have a TV equivalent to see it on. Can the model reproduce a TV's observable behavior through a non-visual interface? No. Can it reproduce it to an audience who cannot see? No. The model is not capable of reproducing anything by itself, just as a model of a brain cannot produce consciousness without there being something which is capable of experiencing it. There is no reason whatsoever to assume that experience would arise automatically, just as a computer monitor does not arise from a video driver.

The fading qualia conjecture is bunk. It arises from the mistaken notion that consciousness is a product of mechanism rather than the subjective correlate of it.

Craig