--- On Mon, 11/17/08, Richard Loosemore <[EMAIL PROTECTED]> wrote:

> What I am claiming (and I will make this explicit in a revision of the
> paper) is that these notions of "explanation", "meaning", "solution to
> the problem", etc., are pushed to their breaking point by the problem
> of consciousness. So it is not that there is a problem with
> understanding consciousness itself, so much as there is a problem with
> what it means to *explain* things.
Yes, that is because we are asking the wrong questions. For example:

Not: should we do experiments on animals?
Instead: will we do experiments on animals?

Not: can computers think?
Instead: can computers behave in a way that is indistinguishable from a human?

-- Matt Mahoney, [EMAIL PROTECTED]
