On Thu, Jul 21, 2011 at 9:44 AM, Craig Weinberg <whatsons...@gmail.com> wrote:
> Since it's not possible to know what the point of view of biological
> neurons would be, we can't rule out the contents of the cell. You
> can't presume to know that behavior is independent of context. If you
> consider the opposite scenario, at what point do you consider a
> microelectronic configuration conscious? How many biological neurons
> does it take added to a computer before it has its own agenda?

I think you're still missing the point. Forget about consciousness for
the moment and consider only the mechanical aspect of the brain. By
analogy consider a car: we replace parts that wear out with new parts
that function equivalently. If we replace the spark plugs, then as long
as the new ones screw in properly and have the right electrical
properties it doesn't matter if they are a different shape or colour.
The proof of this is that the car is observed to function normally
under all circumstances. Similarly with the brain, we replace some existing
neurons with modified or artificial neurons that function identically.
No doubt it would be difficult to make such neurons, but *provided*
they can be made and appropriately installed, the behaviour of the
entire brain will be the same, and *therefore* the consciousness will
be the same. Do you agree with this, or not?


-- 
Stathis Papaioannou
