On Aug 12, 4:21 am, Stathis Papaioannou <stath...@gmail.com> wrote:
> On 12/08/2011, at 1:06 AM, Craig Weinberg <whatsons...@gmail.com> wrote:
> > Sure, muscles will contract for any old material that can conduct an
> > electric current. A muscle doesn't require a high level conversation
> > with the brain's cells to react. We can move in our sleep when we
> > aren't subjectively conscious of it.
> But can the muscles be made to contract through electrical stimulation in
> such a way that you can have an intelligent conversation with them?
You might be able to have an intelligent conversation about glucose or
tensile strength electronically, but it need not have anything to do
with making them contract. Nervous tissue is a special case of
biological tissue in that its purpose is to make its own cellular
experience transparent in favor of refining and telling the stories of
other tissues and their stories of their environment. A muscle cell
isn't necessarily interested in or capable of non-muscular
conversation.
> In a normal person the brain does the complex calculations which
> produce intelligible language from the vocal cords.
If my hypothesis is correct, the brain and the vocal cords work
together to some degree. The brain uses sensory feedback from the
vocal cords in real time to modulate its motive efforts to speak.
> Can the same calculations be done by computer stimulating the vocal
> cords or is there something the computer just won't be able to do?
My guess is that a computer would have to be entrained to the real-
life vocal cords of the particular person's body in order to get
close to perfect fluidity, and that may require 'cooperation' from the
nervous tissues related to the vocal cords. Absent those, the tissues
themselves would have to be hacked into with artificial neurology.
> If so, where will its language deficiencies be, and what is the
> specific mathematical problem the brain can solve but the computer
> can't? If the computer can't copy human behaviour due to lacking
> human consciousness, that is equivalent to saying that there is non-
> computable mathematics in the brain.
To produce human speech, the computer need not have human
consciousness (awareness of the awarenesses of the human organism as a
whole), it just needs awareness of the larynx and the speech centers
of the brain. If you want the computer to be able to understand the
meaning of what it's saying, then you are talking about replacing the
entire prefrontal cortex, in which case, it depends on what you
replace it with as to the extent to which its understanding resembles
human understanding.
If you replace the neurological community which handles speech
articulation only, you might be able to do it well enough that we,
the users, can use it (it would probably have to be entrained from
the cognitive side as well - the brain would have to discover the
implant and learn how to use it), but that doesn't mean that on the
level of the community of the brain and nervous system there is no
difference.
That fact becomes monumentally important when you consider replacing
not just the neurology that you use but the neurology that you
actually identify with personally as you. Even with just the
artificial larynx driver, you may very well be able to tell the
difference in the sound of your own voice, and others may also. It may
feel different to speak, and unanticipated differences may arise in
things such as swallowing, clearing your throat, or noticing a sore
throat before it gets serious.
You received this message because you are subscribed to the Google Groups
"Everything List" group.