On 03 Apr 2013, at 23:53, Jesse Mazer wrote:
On Wed, Apr 3, 2013 at 2:42 PM, Craig Weinberg wrote:
In a universe of functionalism or comp, I would expect that this
would never happen, as my brain should always prioritize the
information made available by any eye that is open over that of an
eye which is closed.
I don't think the "function" in functionalism is supposed to refer
to utility or purpose. Functionalism as I understand it just refers
to the idea that if you replaced each part of the brain with a
"functionally identical" part, meaning that its input/output
relationship is the same as the original part, then this will result
in no change in conscious experience, regardless of the material
details of how the part produces this input/output relation (a
miniature version of the "Chinese room" thought experiment could
work, for example). It's also self-evident that there should be no
behavioral change, *if* we assume the reductionist idea that the
large-scale behavior of any physical system is determined by the
rules governing the behavior and interactions of each of its
component parts (you would probably dispute this, but the point is
just that this seems to be one of the assumptions of
'functionalism', and of course almost all modern scientific theories
of systems composed of multiple parts work with this assumption).
For example, if you have a tumor which is altering your
consciousness and disrupting some other abilities like speech, that
is obviously not serving any useful function, but "functionalism"
wouldn't claim it should, it would just say that if you replaced the
tumor with an artificial device that affected the surrounding
neurons in exactly the same way, the affected patient wouldn't
notice any subjective difference (likewise with more useful parts of
the brain, of course).
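Jesse's criterion here — that a replacement is "functionally identical" when its input/output relation matches the original, whatever its internals — can be made concrete with a toy sketch (my own illustration, not from the thread): two implementations with different internal mechanisms count as interchangeable if they agree on every input.

```python
# Toy illustration of functional equivalence (not from the original
# thread): two "parts" with different internals but the same
# input/output relation. Under functionalism, swapping one for the
# other changes nothing observable from outside.

def part_original(x: int) -> int:
    # Original "part": doubles its input by addition.
    return x + x

def part_replacement(x: int) -> int:
    # Replacement "part": same I/O relation, different mechanism
    # (doubling via a left bit-shift).
    return x << 1

# Functional equivalence over the tested domain: identical outputs
# for identical inputs, regardless of how each part computes them.
for x in range(-100, 101):
    assert part_original(x) == part_replacement(x)
```

The point of the sketch is only that the equivalence is defined at the level of the input/output mapping; nothing in the test inspects *how* either part produces its output.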
There may of course be different meanings that philosophers have
assigned to the term "functionalism", but I think this is one, and
I'm pretty sure it's part of what "COMP" is taken to mean on this list.
You are right. Functionalism means that we can substitute a part with a
functionally equivalent part. Comp, in the weak sense I use it, means
that functionalism occurs at some description level. Then we can
explain that a machine cannot know for sure its own substitution
level, but it can bet on it, and the physics around it can give some
indication. If the comp physics gives exactly the "usual quantum
mechanics", that could be evidence that our substitution level is
given by the Heisenberg uncertainty relations. Note that this is not
the level needed to survive, but to survive in the exact same mental
state. People will accept much higher-level brain substitutions,
because they will be cheaper, and they will not mind so much losing
some memories or even personality traits.
You received this message because you are subscribed to the Google Groups
"Everything List" group.