John Ku wrote:

By the way, I think this whole tangent was actually started by Richard
misinterpreting Lanier's argument (though quite understandably given
Lanier's vagueness and unclarity). Lanier was not imagining the
amazing coincidence of a genuine computer being implemented in a
rainstorm, i.e. one that is robustly implementing all the right causal
laws and the strong conditionals Chalmers talks about. Rather, he was
imagining the more ordinary and really not very amazing coincidence of
a rainstorm bearing a certain superficial isomorphism to just a trace
of the right kind of computation. He rightly notes that if
functionalism were committed to such a rainstorm being conscious, it
should be rejected. I think this is true whether or not such
rainstorms actually exist or are likely, since a correct theory of our
concepts should deliver the right results when the concept is applied
to any genuine possibility. For instance, if someone's ethical theory
delivers the result that it is perfectly permissible to press a button
that would cause all conscious beings to suffer for all eternity, then
it is no legitimate defense to claim that's okay because it's really
unlikely. As I tried to explain, I think Lanier's argument fails
because he doesn't establish that functionalism is committed to the
absurd result that the rainstorms he discusses are conscious or
genuinely implementing computation. If, on the other hand, Lanier were
imagining a rainstorm miraculously implementing real computation (in
the way Chalmers discusses) and somehow thought that was a problem for
functionalism, then of course Richard's reply would roughly be the
correct one.

Oh, I really don't think I made that kind of mistake in interpreting Lanier's argument.

If Lanier was attacking a very *particular* brand of functionalism (the kind that says isomorphism is everything, so that any isomorphism between a rainstorm and a conscious computer, even one holding for just a millisecond, would leave you no option but to call the rainstorm conscious), then perhaps I agree with him. That kind of simplistic functionalism is just not going to work.

But I don't think he was narrowing his scope that much, was he? If he was, he was attacking a straw man. I just assumed he wasn't doing anything so trivial, but I stand to be corrected if he was. I certainly thought that many of the people who cited Lanier's argument were citing it as a demolition of functionalism at large.

There are many functionalists who would say that what matters is a functional isomorphism, and that even though we have difficulty, at this time, saying exactly what we mean by a "functional" isomorphism, it is nevertheless not good enough simply to find any old isomorphism (especially one that holds for only a moment).
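
To make that contrast concrete, here is a toy sketch in Python (entirely my own construction, with hypothetical names; it is not Lanier's or Chalmers' formalism). It shows how cheap a bare trace isomorphism is, and what a functional isomorphism would additionally have to deliver:

# Toy sketch (my own, hypothetical names throughout): pairing any
# finite rain trace with any equally long computation trace is
# trivial, which is exactly why a bare isomorphism proves nothing.

def trace_pairing(rain_states, machine_states):
    """Pair the i-th rain snapshot with the i-th machine state.

    Any two equal-length sequences admit this pairing, so its mere
    existence places no constraint on either system.
    """
    assert len(rain_states) == len(machine_states)
    return list(zip(rain_states, machine_states))

pairing = trace_pairing(
    ["rain_a", "rain_b", "rain_c"],      # arbitrary rain snapshots
    ["state_1", "state_2", "state_3"],   # states of some computation
)

# A functional isomorphism would have to do far more: preserve the
# transition structure, i.e. answer "what would the system do next,
# for every state and every input?" The pairing above records one
# actual history and supports no such counterfactual question.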

I would also point out one other weakness in his argument: in order to get his isomorphism to work, he almost certainly has to allow the hypothetical computer to implement the "rainstorm" at a different level of representation from the "consciousness". It is only if you allow this difference of levels between the two that the hypothetical machine is guaranteed to be possible. If the two are supposed to be present at exactly the same level of representation in the machine, then I am fairly sure that the machine is over-constrained, and thus we cannot say that such a machine is, in general, possible.

But if they happen at different levels, then the argument falls apart for a different reason: you can always make two systems coexist in this way, but that does not mean they are the same "system". There is no actual isomorphism in this case. This, of course, was Searle's main mistake: the understanding of English and the understanding of Chinese were happening at two different levels, and therefore in two different systems, and nobody claims that what one system understands the other must also be "understanding". (Searle's main folly, of course, is that he has never shown any sign of being able to understand this point.)
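
To illustrate that point about levels, here is another toy Python example (again my own, not anything from Lanier or Searle): a single machine trivially produces two state sequences at two different levels of description, and any mapping between them relates levels of description, not states within one system.

import sys

# Toy illustration: one machine, two state sequences at different
# levels of description.

data_level = []   # the simulated "rainstorm" states
host_level = []   # the host interpreter's execution events

def tracer(frame, event, arg):
    # Record the interpreter-level activity that realises the
    # simulation below.
    if event == "line":
        host_level.append((frame.f_code.co_name, frame.f_lineno))
    return tracer

def rainstorm_step(drops):
    return [(d + 1) % 10 for d in drops]

sys.settrace(tracer)
drops = [3, 1, 4]
for _ in range(3):
    drops = rainstorm_step(drops)
    data_level.append(list(drops))
sys.settrace(None)

# data_level and host_level were produced by the same physical
# process, but mapping one onto the other maps between levels of
# description; it does not exhibit an isomorphism within one system.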



Richard Loosemore
